Wednesday, 23 September 2015

Running shell commands from awk

There are a few ways to run shell commands from inside an awk program. One is system(...), but that is better suited to handing values off to another program; it is not the best way to get data back from the shell command.

The best way to get data back from a command is to define the command as a variable, then execute it and pipe the output into a new variable using the built-in getline function.

The input for this example is a log file where the first field is epoch time, and we need to see the time in a human-readable format.



tail /var/log/my.log | awk ' {
   DC="date -d@"$1;
   DC | getline T;
   close(DC);
   printf "%s\t", T;
   for(i=2;i<=NF;i++) {
     printf "%s\t", $i
   };
   printf "\n";
}'

In this example DC becomes the date command that is given the epoch value from field $1. T is the variable holding the time as a string, which we display using printf. Then a loop prints the rest of the data from each line of the log, from the second field up to NF, the total number of fields.


Truncate a file with sed

Log files grow over time, but you don't want to fill your disk. Often a log file is just for simple debugging and does not need to be rotated or kept in /var/log/; sometimes a simple debug log goes well in /dev/shm/. In that case it is important to keep it short and trim off the top of the file from time to time.

I don't know why, but this problem seems very difficult for a lot of people, and they end up writing long, complex, multi-line blocks of code to truncate a file and keep just the end of it. It is super simple with sed.



# only keep the last 100 lines of the log file
sed -i /dev/shm/my_debug.log -e :a -e '$q;N;101,$D;ba'
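
If you want to convince yourself how the address works, the number in the expression is N+1 for keeping N lines. Here is a quick throwaway test that keeps the last 3 lines of a 10-line file, so the address is 4,$.

```shell
#!/bin/bash
# Keep the last 3 lines of a 10-line file; the D address is N+1 = 4
F=$(mktemp)
seq 1 10 > "$F"
sed -i -e :a -e '$q;N;4,$D;ba' "$F"
cat "$F"
rm -f "$F"
```

This prints 8, 9 and 10.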

Tuesday, 23 June 2015

Alpha sequence

When writing a quick little shell script, the seq command is very handy for generating a list of numbers, but all too often I find that I need a sequence of letters, and there is no alphabetical version of seq. There are hundreds of examples on the Internet of how to do this in almost every scripting language you can think of; this is my version in bash.

This version has the added feature of being able to run up a sequence of upper or lower case letters. The script only accepts one argument, the last letter, but could easily be expanded to do all sorts of extra tricks.


#!/bin/bash

OUT=""
TARGET=$(echo ${1:0:1} | grep -i "[a-z]")
[ -z "$TARGET" ] && {
 echo "usage: $(basename $0) <letter>" >&2
 exit 1
}
UPPER=$(echo $TARGET | tr "[a-z]" "[A-Z]")
START=97 # a
[ "$TARGET" = "$UPPER" ] && {
 START=65 # A
}
while [ "$OUT" != "$TARGET" ]; do
 OUT=$(awk 'BEGIN{printf "%c",'${START}'}')
 echo -n "$OUT "
 let "START=$START+1"
done


Examples:


$ alphaseq J
A B C D E F G H I J

$ alphaseq m
a b c d e f g h i j k l m
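
For comparison, bash 3.0 and newer can do the same thing with brace expansion. Brace expansion happens before variables are substituted, so an eval is needed to build the range from an argument; this is just an illustrative sketch, not a drop-in replacement for the script above.

```shell
#!/bin/bash
# {A..J} expands to the letters A through J; eval lets us build the
# range from an argument, since brace expansion runs before $1 exists
LAST=${1:-J}
eval "printf '%s ' {A..${LAST}}"
echo
```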

Tuesday, 21 April 2015

wget script hacking the Cisco DPC3825

My ISP at home has some of the best prices on high-speed Internet around. They can do this because they only use the most budget junk equipment that small amounts of money can buy. Ever since I first subscribed, my home router (CATV modem) progressively gets slower and slower, with decreasing Wi-Fi range, until I go through the steps to log in to the web interface and click "Reset", or pull the power and plug it back in. After that it is good for a few days. The problem is that it dies at the worst times. I have complained for many months, but the only thing that happens is they replace the junk modem with another junk modem.


The best way to solve this is to create a wget script that cron runs around 4:00 AM when no one is using the Internet. The script reboots the modem and has it ready for me to use, problem-free, all day.

wget is a very powerful command line web client. By reading the source of the page you want to script, you can automate almost any action.


#!/bin/bash

wget \
  --save-cookies=cookies \
  --post-data="username_login=cusadmin&password_login=yourpassword" \
  http://192.168.0.1/goform/Docsis_system -O /dev/null 2>/dev/null

wget \
  --load-cookies=cookies \
  --post-data="devicerestart=1" \
  http://192.168.0.1/goform/Devicerestart -O /dev/null 2>/dev/null

Thursday, 9 April 2015

svn log to RPM change history, the Ruby way

This is the Ruby version of the AWK script to convert SVN logs to a change history report suitable for use in an RPM spec file.

I tried to keep the code very similar to the AWK version but there are many ways this code could be optimized to reduce the number of lines.


#!/usr/bin/ruby

require 'date'

nextRow = 0
lastDate = ""
lastRev = ""
newDate = ""
newRev = ""
docs = Array.new

$stdin.each_line do |l|
 if l.include?("-"*72) 
  nextRow = 0
 else
   if nextRow > 0
    if l.chomp.length == 0
     nextRow += 1
    else
     docs.push(l)
    end
   else
    sl = l.split
    newRev = sl[0]
    newDate = sl[4]
    nextRow += 1
    if lastDate != ""
     if lastDate != newDate and docs.length > 0
      printf "* %s Revision %s\n", Date.parse(lastDate).strftime("%a %b %d %Y"), lastRev
      docs.each { |dl|
       puts "- #{dl}"
      }
      puts
      docs.clear
     end
    end
    lastDate = newDate
    lastRev = newRev
   end
 end
end

if docs.length > 0
 printf "* %s Revision %s\n", Date.parse(lastDate).strftime("%a %b %d %Y"), lastRev
 docs.each { |dl|
  puts "- #{dl}"
 }
end

Friday, 13 March 2015

Super Simple Linked List In C

When writing code in a high-level scripting language like Perl, Ruby or Python, it is easy to take for granted what a nice feature it is to have automatic types like arrays or hashes: variables that you can just dump data into and pull it back out any way you like. These features were not always there; someone had to write the code to make it this way.

A linked list in C is a way to store values, as many as you like, for as much memory as you have. Memory for a value is allocated on the fly as needed and stored in a node that knows about its neighbour. A good linked list will have features that let you search, insert, delete from any point, and move backwards and forwards in the list, and most importantly a good list will protect your code from accidental memory bugs. My example today has none of those features.

This is the bare minimum of a simple linked list. It is a singly linked list, so you can only move forward. A doubly linked list lets you move backwards as well.

#include<stdio.h>
#include<stdlib.h>

struct DataNode {
 void *Load;
 struct DataNode *Next;
};

struct DataNode *TheFirst = NULL;
struct DataNode *Current = NULL;
struct DataNode *Last = NULL;

void* NewList(void *Load) {
 struct DataNode *Pointer = (struct DataNode*)malloc(sizeof(struct DataNode));
 if(NULL == Pointer) {
  return NULL;
 }
 Pointer->Load = Load;
 Pointer->Next = NULL;

 TheFirst = Last = Current = Pointer;
 return(Pointer);
}

void* Add(void *Load) {
 if(TheFirst == NULL) {
  return (NewList(Load));
 }

 struct DataNode *Pointer = (struct DataNode*)malloc(sizeof(struct DataNode));
 if(NULL == Pointer) {
  return NULL;
 }
 Pointer->Load = Load;
 Pointer->Next = NULL;

 Last->Next = Pointer;
 Current = Last = Pointer;
 return(Pointer);
}

void* First() {
 Current = TheFirst;
 if(Current == NULL) return(NULL);
 return(Current->Load);
}

void* GetNext() {
 struct DataNode *Pointer = Current;

 // Advance to next
 if(Pointer->Next) {
  Current = Pointer->Next;
  return(Current->Load); 
 }
 return(NULL);
}

int DeleteFirst() {
 int Return = 0;
 if(TheFirst) {
  if(TheFirst->Next) {
   Current = TheFirst->Next;
   Return = 1;
  } else {
   Current = NULL;
   Last = NULL; // the list is now empty, so Last must not dangle
  }
 }
 free(TheFirst);
 TheFirst = Current;
 return(Return);
}

int main(void) {
 char* ToDoList;
 Add("Pet the dog");
 Add("Feed the fish");
 Add("Let the cat out");
 Add("Do the dishes");
 Add("Gas up the car");
 Add("Watch some TV");
 ToDoList = First();
 while(ToDoList != NULL) {
  printf("%s\n", ToDoList);
  ToDoList = GetNext();
 }
 while(DeleteFirst()) {}
 return 0;
}

Tuesday, 3 March 2015

Instant hash in Perl, just do the splits

Today I needed a Perl script that could create a lot of hash cells fast, where the names of the cells are not known until run time. I did not want to use a for loop, so I looked for a better way to do this in less code. When I found the solution I was reminded of some Perl documentation I read saying that most people think the @ symbol means an array, but it does not; it means many. The way this works is that when you reference the hash with the @ symbol (a hash slice) you can put many values into many cells in one line of code.

The first split turns the Fields string into the cell name identifiers, and the second split turns the Values string into the data, putting many values into many named cells of the hash.



#!/usr/bin/perl

my $Fields = "name:address:phone";
my $Values = "Bob:123 Hill Rd:505-050-4321";
my %Hash;
@Hash{split(/:/, $Fields)} = split(/:/, $Values);

for my $key (keys %Hash) {
 print "$key = '$Hash{$key}'\n";
}

Thursday, 26 February 2015

Live firewall hacking

I am a bit of a jerk when it comes to security. If I find out that you emailed your private SSH key, I will remove your public key from everywhere. A lot of people think I am paranoid, I call those people "low hanging fruit".

When it comes to the Linux firewall, I never want to restart it or restore a whole rule set on a public-facing server. There is a long list of bad things that can happen when your firewall is down for just a second.

The correct thing to do is edit just the lines of your firewall rules that you need to change. We can do this with iptables using options like --line-numbers to find the exact line to alter, and --replace or --delete to modify that line.


$ iptables -v --replace INPUT $(iptables -nvL INPUT --line-numbers | grep "^[1-9]" | grep "4.3.2.6" | awk '{print $1}') -s 6.2.3.4 -d 4.3.2.6 -j ACCEPT
ACCEPT  all opt -- in * out *  6.2.3.4  -> 4.3.2.6  
$ iptables -nvL INPUT
Chain INPUT (policy ACCEPT 238 packets, 69612 bytes)
 pkts bytes target     prot opt in     out     source               destination         
    0     0 DROP       all  --  *      *       1.2.3.4              4.3.2.1             
    0     0 ACCEPT     all  --  *      *       6.2.3.4              4.3.2.6             
    0     0 DROP       all  --  *      *       6.9.3.4              4.3.9.6             
$ iptables -v --replace INPUT $(iptables -nvL INPUT --line-numbers | grep "^[1-9]" | grep "4.3.2.6" | awk '{print $1}') -s 6.2.3.4 -d 4.3.2.6 -j DROP
DROP  all opt -- in * out *  6.2.3.4  -> 4.3.2.6  
$ iptables -nvL INPUT
Chain INPUT (policy ACCEPT 18 packets, 2545 bytes)
 pkts bytes target     prot opt in     out     source               destination         
    0     0 DROP       all  --  *      *       1.2.3.4              4.3.2.1             
    0     0 DROP       all  --  *      *       6.2.3.4              4.3.2.6             
    0     0 DROP       all  --  *      *       6.9.3.4              4.3.9.6             
$ iptables -v --delete INPUT $(iptables -nvL INPUT --line-numbers | grep "^[1-9]" | grep "4.3.2.6" | awk '{print $1}')
$ iptables -nvL INPUT
Chain INPUT (policy ACCEPT 12 packets, 1631 bytes)
 pkts bytes target     prot opt in     out     source               destination         
    0     0 DROP       all  --  *      *       1.2.3.4              4.3.2.1             
    0     0 DROP       all  --  *      *       6.9.3.4              4.3.9.6  

Tuesday, 24 February 2015

Running a tight BASH script

A few things I like to do with important bash scripts, so that they live on as useful tools for many years:
1) Send error output to the standard system logger.
2) Clean up any child processes that were started.
3) Clean up any temporary files.


You don't always need all of these but they are good to have handy.

Notice how trap is used to call a function so that the exit of the script can do a few extra things.

#!/bin/bash

# Log any errors to the standard system logs
exec 2> >(logger -s -t $(basename $0))

# Clean up when the program exits
function CleanExit {
 # stop any long running commands
 for k in $(jobs -p); do kill "$k"; done

 # remove any temporary files created
 # rm -f $TEMPFILE
 exit
}
trap "CleanExit" EXIT

# Your code goes here
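
Here is a minimal runnable demo of the trap pattern above: the cleanup function fires even when the script exits early, so the temporary file never survives.

```shell
#!/bin/bash
# The EXIT trap runs CleanExit no matter how the script ends
TMPFILE=$(mktemp)
function CleanExit {
 rm -f "$TMPFILE"
 echo "cleaned up $TMPFILE"
}
trap "CleanExit" EXIT

echo "working in $TMPFILE"
exit 0
```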

Friday, 13 February 2015

Get values and catch exit code in one line of BASH

Often it is helpful if a variable is only used when a command is successful.


FILE_SIZE=$(stat --format="%s" /bin/bash) && { 
 echo "Size is $FILE_SIZE"; 
} || { 
 echo "File size is unknown."; 
}


This becomes tricky if you use a pipe to parse the result, because the pipe forks a new shell completely isolated from the original command. Using PIPESTATUS it is possible to check the exit code of a previous command to the left of the pipe, and then it is easy to test the result.

HOST_IP=$(host $SOME_HOST_NAME | awk '{print $NF}'; [ ${PIPESTATUS[0]} -eq 0 ] ) && {
 echo "Host IP is $HOST_IP"
} || {
 echo "no such host"
}
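
Since bash 3.0 there is also set -o pipefail, which makes a pipeline's exit status the status of the rightmost command that failed, so the plain && test works again without touching PIPESTATUS. A sketch, using false as a stand-in for the failing command on the left:

```shell
#!/bin/bash
set -o pipefail

# awk itself succeeds, but pipefail reports the failure of the left side
RESULT=$(false | awk '{print $NF}') && {
 echo "got $RESULT"
} || {
 echo "pipeline failed"
}
```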

Thursday, 12 February 2015

Unit conversion with awk

I needed a simple unit converter to change a number to kilo, mega and giga, so I went looking, found some large complex code, and decided I could make it much smaller.

This will convert your number to a human readable value.

Try it with this:
for i in $(seq 1 1 64); do echo -n "$(echo "2^$i"|bc) =  "; echo "2^$i"|bc|awk -f convert.awk; done


{
 S=$1;
 Us=" kMGTPEZY";
 U="";
 i=2;
 while(S>=1024){
  S/=1024;
  U=substr(Us,i,1);
  i++;  
 };
 printf "%0.0f%sb\n", S, U
}

Wednesday, 11 February 2015

Perl named capture containers, the coolest parsing trick ever.

Named captures only work in Perl 5.10 and up. In this example I am able to match lines in a log file by a keyword. A tag like <time> becomes the key name in the hash %+.

Values can be extracted like you would with a normal hash.



while (<>) {
 $Line = $_;
 $Line =~ s/\n//;
 $Line =~ m/^(?<time>\d{10}\.\d{3})\s*(?<status>connection)\s*(?<fd>\d*)\s*(?<ipaddress>\d+\.\d+\.\d+\.\d+).*/ ||
 $Line =~ m/^(?<time>\d{10}\.\d{3})\s*(?<status>disconnect)\s*(?<fd>\d*).*/ ||
 $Line =~ m/^(?<time>\d{10}\.\d{3})\s*(?<status>monitor)\s*(?<fd>\d*).*/;
 if(keys(%+) > 0) {
  foreach my $KeyName (keys %+) {
   $x{$KeyName} = $+{$KeyName};
  }
 }
}

Tuesday, 10 February 2015

Learning to program: What is the best language to learn?

A lot of people ask me what is the best programming language for them to use. People new to coding think this is a good question to ask an expert, but it is actually a silly question. It is about as helpful as asking me what I think you should eat for lunch. As a new developer you should pick a language that helps you learn the steps of procedural thinking, unencumbered by all the extra steps that can be very daunting. Build your confidence with a simple scripting language; there are a great many to choose from. As you get further along you will find that choosing the right language has more to do with what the end product needs to be and less with what your skill level is. I have experience with dozens of programming technologies. Each one I had to learn because my skills needed to fit the project. You should never make the project fit your skills, or your project is doomed.

Here are some notes about scripting and programming that may help you get started.

Shell Scripts
This is about as simple as it gets. If you can run the ls command, then you have the skill to put it into a file and run it as a script.

The best thing about shell scripts is that you get all the power of the commands in the Linux OS, and that is the meat and potatoes of a good shell script.

You wouldn't use Perl to run a lot of OS commands; your code would be just a bunch of system() calls.

Perl Scripts
Perl scripts run much faster than shell scripts and are better at parsing data. There are many libraries for doing almost anything you can think of, but they will require some extra skill. Perl by itself is going to be only slightly more interesting than a shell script. If you need more power than a shell script but you are not ready for Perl, take a look at Ruby or Python. They have become very popular with new programmers for their friendly learning curve and ease of use.

C & C++
These are languages that compile your code to binary. C came first and has a long history of being the ox of the computer world: a bit stupid, but works hard. C++ is the descendant of C. If C is like an ox, then C++ is like a John Deere tractor with air conditioning and a stereo. C++ is an object-oriented language. You will hear about objects a lot in high-level languages; it is just a word that means a block of code or a library that can be used. An object is a collection of tools that provides some features for you so you do not need to re-invent the wheel yourself; someone else has done the work, and all you need to do is connect the objects together to make them into your own program. C++ objects try to be smart; they try to protect you from using them in ways that could break your program.

C programs are smaller and faster but harder to write, because you need to write more code from scratch. You use C when you need small and very fast; you don't use C when your project is large and complicated.

C++ programs are larger and just a little bit slower because of the objects added to them. You use C++ for almost any project, large or small. Many programmers will never use C again after learning C++.

HTML
HTML is not a programming language. Most people think it is not a language because it is not procedural, but I have worked with languages that are event-state and not procedural at all, yet still languages. Is HTML not a language because it is a static layout? No, because CSS can make the content fully dynamic. Add JavaScript and HTML can become a great way to build the interface for your project. HTML is not a real programming language because, at best, it is only half of the code you will need. Unless you're building a toy, you are going to need something on the other side to communicate with, and that will be a web server. I would not recommend first-time programmers pick up HTML as their first language because of this. HTML is very easy to write and you can make some fun stuff, but eventually you will need a server running PHP, Perl, Python, Ruby, C, C++ or even shell scripts to handle the server side of the program.

Monday, 9 February 2015

Bash: Embed functions in commands.

Sometimes you want to do a complex action with your shell script but you don't want the mess of creating another shell file to call externally.

You can export the function and use it much like a callback.


#!/bin/bash

function Stats {
    FILE="$1"
    STAT=$(stat --format="%s" "$FILE")
    echo "File $FILE is $STAT bytes"
}
export -f Stats
find . -type f -exec bash -c 'Stats "$1"' _ {} \;

Friday, 6 February 2015

Extracting named columns of data with awk

Sometimes when a vendor updates a diagnostic tool they change the order of the fields, so it is not a good idea to extract columns of data by index. This simple awk script is able to get data by the name of the column rather than by index.

The samples contain the same data but the columns are in different orders. The results from both files should be the same.

samp0.txt
id;width;rank;age;height
1;900;4;20;500
2;1900;11;32;200
3;70;8;43;50

samp1.txt

id;age;height;width;rank
1;20;500;900;4
2;32;200;1900;11
3;43;50;70;8

GetField.awk

BEGIN {
 FS=";"
 C=0
} {
 if((C > 0) && ($C!="")) {
  print $C
 }

 if(C==0) {
  for(i=1; i<=NF; i++) {
   if($i == ARGV[2]) {
    C=i
   }
  }
 }
}

Run the commands
]$ cat samp0.txt | awk -f GetField.awk - age 2>/dev/null
20
32
43
]$ cat samp1.txt | awk -f GetField.awk - age 2>/dev/null
20
32
43
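
The ARGV trick works, but it is why the 2>/dev/null is needed: awk also tries to open the column name as a file. A cleaner variant passes the column name in with -v instead; here is a sketch with the sample data inlined so it runs on its own.

```shell
#!/bin/bash
# Same idea using -v: no ARGV games and no stderr noise to hide
awk -F';' -v col="age" '
 NR==1 { for (i = 1; i <= NF; i++) if ($i == col) C = i; next }
 C { print $C }
' <<'EOF'
id;width;rank;age;height
1;900;4;20;500
2;1900;11;32;200
3;70;8;43;50
EOF
```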

Thursday, 5 February 2015

Star, Open, Libre. Automating your spreadsheets

I started using StarOffice when it first came out, and it was good. OpenOffice was great and did everything I ever needed. When Oracle bought Sun Microsystems, the developers left to create LibreOffice. OpenOffice was later released by Oracle and now lives at the Apache Software Foundation. Some day the two may get unified, but for now LibreOffice is the popular choice, and it is great.

I love to create spreadsheets and often offer to help my co-workers with automating their scalc files for them. I have a lot of fun writing StarBasic macros. With a little work the StarBasic macros can be modified to work in other office programs.

Two functions that I add to almost every spreadsheet are GetCell and GetDataCell, to make it easy and fast to read and change cell contents.


Function GetDataCell(SheetName As String, DataName As String, Row) As com.sun.star.table.XCell
 AllSheets = ThisComponent.Sheets()
 FindSheet = AllSheets.GetByName(SheetName)
 For ALoopCounter = 0 to 1000 Step 1
  TheCell = FindSheet.GetCellByPosition(ALoopCounter,0)
  If TheCell.String = DataName then
   TheCell = FindSheet.GetCellByPosition(ALoopCounter,Row)
   Exit for
  End if
  If TheCell.String = "" then
   MsgBox("Could not find DataField = "+DataName)
   Stop    
  End if
 Next ALoopCounter 
 GetDataCell = TheCell
End Function


Function GetCell(SheetName As String, Column, Row) As com.sun.star.table.XCell
 AllSheets = ThisComponent.Sheets()
 FindSheet = AllSheets.GetByName(SheetName)
 TheCell = FindSheet.GetCellByPosition(Column,Row)
 GetCell = TheCell
End Function

Wednesday, 4 February 2015

Bash me with a pipe

I love to use the pipe. It is a simple tool that brings so much power. Here is how you can take things to the next level.

The snippet below shows how to redirect STDOUT to the STDIN of one script and the STDERR to the STDIN of another.

You cannot use a pipe for both, because a pipe only redirects the file descriptor connected to fd 1, normally STDOUT unless you change it. When you use a pipe, everything to the right becomes a different program, so STDERR needs a different solution.

The solution is to redirect fd2 before you use the pipe on fd1.

$ crontab -l
# This cron entry sends STDERR to MyErrorLogger.sh and STDOUT to MyLogger.sh
* *  *  *  * ./MyScript.sh 2> >(MyErrorLogger.sh) | MyLogger.sh



$ cat MyLogger.sh 
#!/bin/bash
# This takes STDIN and sends it out to a file that has the date in its name
cat - >> mylog_$(date +%F).log 



$ cat MyErrorLogger.sh 
#!/bin/bash
# This takes STDIN and sends it out to a file that has the date in its name
cat - >> myerrors_$(date +%F).log



$ cat MyScript.sh
#!/bin/bash
# This sends sample text to both STDOUT and STDERR
echo "Hello world!" | tee /dev/fd/2

Tuesday, 3 February 2015

Perl standard deviation

First of all, yes, there is a module for this, but what fun is that? I needed to do a standard deviation calculation in a production system where we have a custom stripped-down Perl, and I did not have time to get change control approval for installing a new RPM. I don't need change control to have my own Perl script in my home directory, and it is so simple. Also, I love writing math code; whenever possible I try to use a bit-mask test rather than big lists of if conditions.

Perl is one language that is bittersweet to me. It is very powerful, has endless ability, and millions of modules that add every possible feature you could want. My only complaint is that it is so flexible that it creates the perfect environment for bad coding habits. Most of the worst code I have ever seen was written in Perl, and some of it was my code.

Coding in Perl is fun and it is a very powerful tool. If you have not done any Perl coding you should give it a try. For now here is my simple code to calculate standard deviation.

#!/usr/bin/perl
use strict;

# Prevent a division by 0 error in case you get no arguments
exit 1 unless(scalar(@ARGV));

# Step 1, find the mean of the numbers
my $total1 = 0;
foreach my $num (@ARGV)
{
        $total1 += $num;
}

my $mean1 = $total1 / (scalar @ARGV);

# Step 2, find the mean of the squares of the differences
# between each number and the mean
my $total2 = 0;
foreach my $num (@ARGV)
{
        $total2 += ($mean1-$num)**2;
}
my $mean2 = $total2 / (scalar @ARGV);

# Step 3, standard deviation is the square root of the
# above mean
my $std_dev = sqrt($mean2);
printf "%0.2f", $std_dev;

Monday, 2 February 2015

That's what I sed!

The power of the Linux CLI comes from the ability to connect thousands of programs together to make new tools. The most commonly used connector is the pipe, |. It lets you send the output (STDOUT) of one program to the input (STDIN) of another. The sed program is great for taking input and changing it. The name sed is short for Stream EDitor; the stream is the data that flows into and out of the program. sed can also edit files in place.

I use a command like this to edit the XML configuration files of Tomcat servers to point the Java program to new MySQL servers.



sed -i /usr/share/tomcat6/conf/server.xml.new \
-e '/Context docBase="someblock"/,/<\/Context>/{s/url="jdbc.*mydb/url="jdbc:mysql:\/\/'$DB_HOST':3306\/mydb/}'
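
The range address /start/,/end/ is what limits the substitution to one Context block. Here is a self-contained demo on hypothetical file contents showing that a second Context block is left alone:

```shell
#!/bin/bash
# Only the url inside the someblock Context gets rewritten
F=$(mktemp)
cat > "$F" <<'EOF'
<Context docBase="someblock">
 <Resource url="jdbc:mysql://oldhost:3306/mydb"/>
</Context>
<Context docBase="other">
 <Resource url="jdbc:mysql://oldhost:3306/mydb"/>
</Context>
EOF
DB_HOST="newhost"
sed -i -e '/Context docBase="someblock"/,/<\/Context>/{s/url="jdbc.*mydb/url="jdbc:mysql:\/\/'$DB_HOST':3306\/mydb/}' "$F"
cat "$F"
rm -f "$F"
```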

Friday, 30 January 2015

CLI Tools: screen

One of my favourite CLI tools is screen. Basically it lets you detach from your shell session while it keeps running. You can reattach to it later from anywhere: start your shell on a local TTY, then reattach via SSH. You can even share your session and have others join in so you can collaborate on the shell.

The sample code for today shows how to make an init script that runs your program as a detached daemon using screen. The cool thing about this is that it lets you connect at any time to see what it is doing. This is not recommended for normal use, but it is a great way to debug a new program before you commit to making it a proper daemon.


#!/bin/bash

cd /etc/init.d

start() {
 echo "Starting Service: "
 APP_CMD="/usr/local/bin/MyProgram.pl"
 screen -dmS pmsgd $APP_CMD
 echo
}

stop() {
 echo "Cannot stop screen programs this way. Open the screen sessions."
 echo
}

restart() {
 stop
 start
}

# See how we were called.
case "$1" in
  start)
 start
 ;;
  stop)
 stop
 ;;
  restart)
 restart
 ;;
  *)
 printf "Usage: %s {start|stop|restart}\n" "$0"
 exit 1
esac

exit 0

Thursday, 29 January 2015

Find zen for your users with Zenity

"That is a great program you made for us but we don't like that command line thing. Can you make it a GUI?" To which you reply, "Why yes I can. Just give me a few minutes."

Shell scripts are great. You have all the power from thousands of Linux commands that you can connect together to make the most impressive scripts since the ENIAC. No matter how great your scripts are there will be users that just don't like it because of the command line.

You have three options: create a web server interface, re-write it as a binary with graphical objects, or use pre-existing dialog tools to add GUI features to your program. Zenity can do that for you.

There are a few dialog tools, like zenity, kdialog, Xdialog and many more. They are not very hard to use and can provide a lot of helpful interaction with things like information boxes, input boxes, selections and radio buttons.


zenity --entry --text "Please enter your name"

Wednesday, 28 January 2015

Daemonize your code

If you write a shell script or program and need to fork it to the background as a daemon, there can be some issues. The usual daemon() library call may not totally disconnect from every open file descriptor. This simple C program can do it all for you, and it is easy to use: just put this program in front of your normal command and it will turn any normal program into a system-level daemon.


#include <sys/types.h>
#include <sys/stat.h>
#include <stdio.h>
#include <stdlib.h>
#include <fcntl.h>
#include <errno.h>
#include <unistd.h>
#include <syslog.h>
#include <string.h>

int main(int argc, char *argv[]) {
 int i;
 // int reterr;
 pid_t pid, sid;
 
 //Fork the Parent Process
 pid = fork();
 
 if (pid < 0) { exit(EXIT_FAILURE); }
 
 //We got a good pid, Close the Parent Process
 if (pid > 0) { exit(EXIT_SUCCESS); }
 
 //Change File Mask
 umask(0);
 
 //Create a new Signature Id for our child
 sid = setsid();
 if (sid < 0) { exit(EXIT_FAILURE); }
 
 //Change Directory
 //If we cant find the directory we exit with failure.
 if ((chdir("/")) < 0) { exit(EXIT_FAILURE); }
 
 //Close Standard File Descriptors
 close(STDIN_FILENO);
 close(STDOUT_FILENO);
 close(STDERR_FILENO);
 
 //----------------
 //Main Process
 //----------------
 for(i=0; i < argc - 1; i++) {
  argv[i]=argv[i+1];
 }
 argv[argc-1] = NULL;
 execv(argv[0], argv);
 //reterr = execv(argv[0], argv);
 //printf("execv failed with '%s'\n", strerror(errno));

 // execv only returns on failure
 return EXIT_FAILURE;
}