## Monday, November 01, 2010

### Ubuntu Update Manager's Proxy does not change when system proxy is changed

All efforts to change the Update Manager's proxy come to a grinding halt because the proxy is "hidden" in a rather safe location. Ubuntu Update Manager's proxy does not change after changing the system-wide proxy or the proxy of the Synaptic program, because it uses apt-get's proxy settings, which can be amended by looking at the /etc/apt/apt.conf file. This file contains the information about proxy servers.
This information can be amended to gain direct access to the internet, or if the need is to change to a different proxy server.
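The relevant apt.conf entry uses the Acquire::http::Proxy directive; a minimal example is shown below, with a placeholder host and port that you must replace with your own values:

```
Acquire::http::Proxy "http://MY_PROXY_IP_OR_HOST:8080/";
```

Editing or removing this line switches apt-get, and hence the Update Manager, to the new proxy or to a direct connection.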

## Saturday, October 30, 2010

### The easiest SVN tutorial ever

Browsing through the Internet, it is extremely difficult to find the Subversion (svn) commands at a glance and be productive immediately. So here is a step-by-step list of instructions that can be followed to accomplish the most common tasks in svn.

We will use the project on SourceForge named DrJava as an example.

If the project is already there and you wish to CHECKOUT some code to view, here is the command.
$ svn co https://drjava.svn.sourceforge.net/svnroot/drjava drjava
Now if you wish to add a new file to the svn repository, the command is very simple. For example, if you wish to add the text file README_TEXT:
$ svn add README_TEXT
The above command will not add the file to the server but only schedule it for addition. If you wish to view the current status of scheduled additions and deletions, you can use the command
$ svn status
And finally, if you are satisfied, simply do the following and the commit will be done, meaning all the changes will now be part of the repository and a new revision will be assigned. REMEMBER that adding/deleting/changing code in a repository requires permissions, so you might be prompted for a password during the process.
$ svn commit
This is all the information that you need to get past the basic svn commands and concentrate on your project.

Now that you have gone through one cycle, it will be easier to use the following command to explore other features of svn.
$ svn help

## Tuesday, October 26, 2010

### MySQL quickest way to look at databases

Looking for a MySQL administration tool is a task in itself. Installing and configuring phpMyAdmin requires Apache, PHP and then a successful configuration. So what is the fastest possible way to look at databases and tables and execute queries?
$ mysql -u username -p
The command will prompt for a password, after which the MySQL shell appears.
mysql> show databases;
mysql> show tables;
The above two commands are extremely useful when one wants to look at the installed databases and the tables in individual databases.

For example, to look at the users table in the database drupal6, the following commands will suffice.

mysql> use drupal6;
mysql> select * from users;

These commands work in both Linux and Windows, are very handy, and don't require much setup.

### PHP Issue :: preg_split instead of explode function

If the need is to parse a string and extract the words, it is advisable to use preg_split instead of the explode function.

Code follows:
$words = preg_split("/\s+/", $textInput);
where the first argument is the regular expression matching one or more whitespace characters and the second argument is the input text. This works far better than explode(" ", $textInput), which splits only on a single space and produces empty elements when words are separated by multiple spaces.
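For comparison, the same whitespace split can be sketched in Python, where re.split plays the role of preg_split (this is an analogue of mine, not part of the original PHP snippet):

```python
import re

text = "words   separated \t by   uneven whitespace"
# Split on one or more whitespace characters, like preg_split("/\s+/", ...)
words = re.split(r"\s+", text.strip())
print(words)
```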

## Tuesday, September 21, 2010

### Judging the Joomla Jungle :: The Premier CMS

My first experience of using a comprehensive Content Management System (CMS) was Drupal, way back some years ago. Before that, content management meant either using FrontPage or direct FTP of the web pages. Despite "hearing" a lot of good reviews about Joomla, I never really got the time to explore it.

First Steps
The first steps towards installing Joomla on Linux are the usual ones for web servers: install LAMP (Linux, Apache, MySQL and PHP). After testing LAMP's successful installation, proceed with the following.

Download Joomla from the website and follow the instructions given in the INSTALL.php file in the main folder; the instructions are copied verbatim below. Copy the folder into the www directory of Apache.

{verbatim copy starts}
First, you must create a new database for your Joomla! site e.g.

First, you must create a new database for your Joomla! site e.g.

$ mysqladmin -u db_user -p create Joomla

MySQL will prompt for the 'db_user' database password and then create the initial database files. Next you must login and set the access database rights e.g.

$ mysql -u db_user -p
Again, you will be asked for the 'db_user' database password.  At the MySQL prompt, enter following command:
GRANT ALL PRIVILEGES ON Joomla.* TO nobody@localhost IDENTIFIED BY 'password';
where:
'Joomla' is the name of your database
'nobody@localhost' is the userid of your webserver MySQL account

If successful, MySQL will reply with
Query OK, 0 rows affected
to activate the new permissions you must enter the command
flush privileges;
and then enter '\q' to exit MySQL.
Alternatively you can use your web control panel or phpMyAdmin to create a database for Joomla.

{verbatim copy ends}

So once the installation was complete, simply go to http://localhost/joomla (depending on where you stored Joomla in the Apache directory). This launches the web installer, and answering a couple of simple questions completes the installation of Joomla.

Oops, where are my other databases!!!!
Having installed and worked on Joomla, I realized that my other databases had been lost. Moreover, the admin password had been lost too, so I had to recover the password again via the instructions given here. This glitch created a lot of headache, but it was finally resolved after recovery of the admin password.

Using the CMS
The result of this installation procedure was a sleek Joomla content management system. It is extremely easy to get started and actually use it via the admin panel by going to http://localhost/joomla/administrator.
The whole website can be managed through a set of menus and a control panel. There is the template manager, which helps in installing new templates as well as editing/previewing/applying templates. The other important aspect is the easy maintenance of posts. The CMS is nice, although it takes time to figure out all the concepts. In contrast to Drupal (which was really easy to manage via the web interface of the administrator account), Joomla lacks a bit of such flexibility.

Next Steps
The next step is the installation of themes and extensions. In this regard, the following seem attractive and worthy of exploration.
• Financial Extensions
• Site Analytics

## Tuesday, August 17, 2010

### Learning Probability via Octave

Probability is a subject that gives everyone sleepless nights. Octave is the MATLAB clone, or for some a MATLAB wanna-be. GNU Octave is a high-level language, primarily intended for numerical computations. The best way to learn Octave is to take a difficult task (or goal) and then start using it.

Octave Basics
Octave can be installed using the following command in Ubuntu.
$ sudo apt-get install octave
Once installed, octave can be invoked via the command.
$ octave
It is best to understand the modus operandi of Octave first. Just as lists are the primary data structure in the LISP language, in Octave vectors are at the core. Vectors are used to store information, manipulate it via vector algebra, and display data using 2D/3D graphics. The applications of Octave are huge, but we will use probability as a starting point and see how we can learn probability better using the computational and graphical features of Octave.

Uniform Probability Distribution
Uncertainty of events can be modeled using concepts of probability. If a set of events is equally likely to occur then we say that the events have equal or uniform probability of happening.
The first step towards developing a mathematical model for probability is to map real-life events to random variables (in our case the random variables will take on real-number values). Thus, instead of using real-life events we will refer to the values 1, 2, 3, 4, ... when talking about events (e.g., sunrise, sunset, etc.). Let's use the example of a die having six faces, each marked with one of the numbers 1 to 6. The events of getting a particular face after rolling the die are getting the face marked one, two, three, four, five or six. Since we cannot say with certainty which face will come up on a roll of the die, we say that x is a random variable which takes on values {1, 2, 3, 4, 5, 6} corresponding to the aforementioned events.
Now this information can be stored as a vector x = [1 2 3 4 5 6], so let's see the corresponding command in Octave by simply typing x=[1 2 3 4 5 6].
octave:1> x=[1 2 3 4 5 6]
x =

1   2   3   4   5   6
The output x = followed by 1 2 3 4 5 6 shows the values stored in the vector x. This display can be suppressed by placing a semi-colon (;) at the end of the command. Now that we have declared a variable named x with values, let's see how Octave stores it. This can be checked at any time by typing the command whos at the Octave prompt.
octave:2> whos
Variables in the current scope:

Attr   Name        Size                     Bytes  Class
====   ====        ====                     =====  =====
       ans         1x30                        30  char
       x           1x6                         48  double
The size of variable x is 1x6, which means that it is a vector (matrix) of 1 row and 6 columns, with 48 bytes needed to store it, while the class of the variable is double.

Now let's see what the probability of getting a particular face is. Since all the faces of the die have an equal probability of showing up, the probability of getting a 1 is equal to the probability of getting a 2, 3, 4, 5 or 6. Therefore, in our model the probability is given by p(x) = 1/6 (where 6 represents the total number of possibilities). Now let's calculate the probabilities of the events using Octave.
octave:3> px=1/6*ones(1,6)
px =

0.16667    0.16667    0.16667    0.16667    0.16667    0.16667
Here we have used the function ones, which declares a vector of size 1x6 and initializes it with ones. We simply multiply it by 1/6 to get 0.16667 in all the elements. To check the probability of event 3, simply write the following command.
octave:4> px(3)
ans =  0.16667
Similarly for all the other events.  This was simple because we used the simplest possible probability model.
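The same uniform model is easy to cross-check outside Octave; here is a small Python sketch of the die example, using plain lists instead of Octave vectors:

```python
# Uniform pmf for a fair six-sided die, mirroring px = 1/6*ones(1,6)
px = [1.0 / 6.0] * 6

print(px[2])    # probability of face 3 (Python lists index from 0)
print(sum(px))  # sums to 1, up to floating-point rounding
```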

Normal Probability Distribution
Moving further, let's turn to a more "realistic" probability model. The normal probability distribution is a very practical way to model many situations; its density is f(x) = (1/sqrt(2*pi*sigma^2)) * exp(-(x-mu)^2/(2*sigma^2)), where mu is the mean and sigma the standard deviation. Special thanks to Guillaume Riflet's website for the equations in this article.
Trying out this equation in Octave gives the following result, using the same events x=[1 2 3 4 5 6], with mean mu=3.5 and standard deviation sigma=1.
octave:5> mu=3.5; sigma=1;
octave:6>  fx=(1/sqrt(2*pi*sigma^2)*exp(-(x-mu).^2/(2*sigma^2)));
fx =

0.017528   0.129518   0.352065   0.352065   0.129518   0.017528
Since the mean was selected to be 3.5, we can see that the probabilities of 3 and 4 are the maximum here, while the probability decreases as we move away from 3.5 on either side. This is the classical "bell-shaped" or Gaussian curve. It does not look clear because we took only 6 data points to plot the curve; we can increase the number of points by using the following code, and the resulting graph is shown.
octave:8> x=[1:0.1:6];
octave:9>  fx=(1/sqrt(2*pi*sigma^2)*exp(-(x-mu).^2/(2*sigma^2)));
octave:10> plot(x,fx)

Now that the number of data points has been increased, the graph is much smoother and looks like a bell-shaped curve, and the density really is maximum at 3.5, as it should be since mu (the mean) = 3.5.
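For readers without Octave at hand, the same density formula can be sketched in Python; the values should match the Octave output above to five decimal places:

```python
import math

def normal_pdf(x, mu=3.5, sigma=1.0):
    # f(x) = 1/sqrt(2*pi*sigma^2) * exp(-(x-mu)^2 / (2*sigma^2))
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

fx = [normal_pdf(x) for x in (1, 2, 3, 4, 5, 6)]
print(fx)  # symmetric around the mean, peaking between 3 and 4
```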

Multivariate Probability Distribution

In order to generate further interest, let's now look at the multivariate normal distribution. Its density is p(x) = exp(-0.5*(x-mu)'*inv(Sigma)*(x-mu)) / ((2*pi)^(d/2) * sqrt(det(Sigma))), where x and mu are d-dimensional vectors defining the multidimensional event and the means, while Sigma is the covariance matrix.

Implementing the above equation requires further constructs in Octave, like the for-loop; let's use the example given at the wiki to learn.
octave:10> mu = [40; 60]; sigma = [100, 30; 30, 140]; % mu as a column vector, so x - mu below is well-defined
octave:11> isigma = inv(sigma);
octave:12> detsigma = det(sigma);
octave:13> coeff = 1/(2*pi*sqrt(detsigma));
octave:14> for i=1:100
>   for j=1:100
>     x = [i; j];
>     xm = x - mu;
>     p(i,j) = exp(-0.5*xm'*isigma*xm);
>   end
> end
octave:15> p = p*coeff;
octave:16> [X,Y] = meshgrid(1:100,1:100);
octave:17> surf(X,Y,p);

The plot above is a 3-D graph of the multivariate normal probability density p(x,y) of the two variables x and y. This methodology helps us learn probability in a fun manner as well as get familiar with Octave's computational and graphical features.
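As a cross-check on the Octave loop above, the bivariate density can also be evaluated in plain Python; the helper below hard-codes the 2x2 determinant and inverse (the function name and structure are my own, not from the original post):

```python
import math

MU = (40.0, 60.0)
SIGMA = ((100.0, 30.0), (30.0, 140.0))  # covariance matrix

def bivariate_normal_pdf(x, y, mu=MU, sigma=SIGMA):
    (a, b), (c, d) = sigma
    det = a * d - b * c
    # Quadratic form (x-mu)' * inv(sigma) * (x-mu), using
    # inv(sigma) = (1/det) * [[d, -b], [-c, a]] for a 2x2 matrix
    dx, dy = x - mu[0], y - mu[1]
    quad = (d * dx * dx - (b + c) * dx * dy + a * dy * dy) / det
    return math.exp(-0.5 * quad) / (2 * math.pi * math.sqrt(det))

print(bivariate_normal_pdf(40, 60))  # the peak of the surface, at the mean
```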

## Friday, July 02, 2010

### Best web hosting companies :: A futile search without a plan and execution

Web hosting is a thriving business, and new services keep propping up almost daily. Now, with every new choice, new questions need to be asked. Which web hosting service should I go for? The answer to this question does not lie in any search engine queries but rather in a plan. In order to determine the best web hosting companies, one needs a plan and an execution map.

Plan Phase
The selection of a web hosting service is easier if answers to the following questions can be determined.
1. Why do we need a web hosting service?
2. What is the goal of the website?
3. What will be our needs in the next three years?
4. Which departments in the company are planning to use the website in the next few years?
5. Is our company's business uniform enough to warrant a company-wide website?
6. How much is the company willing to invest? What is the expected budget of this endeavor?
7. What are the expected automation needs of the company in the next three years? Calendars, email and other needs.
8. What is the maximum expected number of users (departments) who will upload data to the website?
The importance of these questions varies from company to company but every company must ponder over more or less each of these questions. This helps in preparing a map for searching the best web hosting company for our purposes.

Execution Phase
After pondering over each of these questions, we need to heed the following points, which will filter out options and help us focus.

1. Throwing money at it won't make the problems go away.
Everyone is interested in offering a service at a cost. Don't get lured by low-cost solutions or, for that matter, overly priced solutions. Look before you leap. Expensive solutions do not necessarily mean stable and robust ones.

2. Linux is the best
Yes, Linux is the best, but whether it is best for you depends on the environment in your enterprise. If Windows and Microsoft are the name of the game in the organization, then choosing Linux for website hosting might be counter-productive, in that it would not fit in with the software currently being used, e.g., MS Project, MS Office, MS Exchange Server, and email clients like Outlook. The difference in prices between Linux and Windows web hosting services does not warrant putting all the investment already made in Microsoft software to disuse.

3. My cousin's company gives the lowest rates
Avoid buying a service on referrals that are not based purely on business terms. If someone refers a company whose only 'positive' point is that it is owned by the referrer's relative, then this is probably a huge drawback instead of an advantage. One might not be able to fully negotiate the terms or, in the event of problems, really push the company to provide support.

4. My web hosting company performs really well
Avoid giving the contract to a company just because it works well for someone else. The reason is that your company's needs might be completely orthogonal to the other party's.

5. Traffic does not matter at all
Give proper attention to the traffic caps in the hosting plans. Never fall for the trap of assuming that your traffic needs six months from now will remain the same as they are today. Websites have a strange pattern of jumping from single-digit hits to hundreds (or at times thousands) of hits per day.

Future Decisions
Once a web hosting plan has been selected and the website has been deployed, it is imperative to monitor the website as much as possible. This helps in avoiding problems as well as planning the growth of the website. A first step could be registering the website with analytics service providers; Google Analytics is one example of such a service. A second step is to get the website report from Alexa. A third, optional, step could be to buy monitoring software like the one developed by Hubspot.

### Why is Operating Systems Programming so difficult?

The character of operating system programming has remained more or less constant over the years. That is a polite way of saying that it is still as scary as ever. Why is it so? Complex systems require complex programming methods, and this goes against the spirit of the dictum "Good programs are easier to understand and develop". So can we conclude that the system programming being done is bad programming? Or that the programmers are not following best practices and all the elaborate methods created to ease programming? Or that they are still dealing with a lot of legacy code? Looking at the output, "they" are the best talent. Probably it is the fact that they are so good that they create something so brilliant that nobody else can fathom or appreciate it. The reality needs to be explored.

System Programming
So what exactly is operating system programming? Is it really something different from regular programming, or is it just a subset of programming? Systems programming, in contrast to application programming, is directed towards the hardware and the system itself, while application programming deals with creating software that interacts with the user.

Sources of Complexity
The complexity in dealing with systems programming is, amongst others, due to the following reasons.

1. Legacy
2. Diversity

Legacy
Legacy code is very hard to get rid of in application programming, but when it comes to systems programming, legacy code is EXTREMELY hard to get rid of. Since the legacy code was written with an extremely large investment of time, and also due to the scarcity of available expertise, it is more feasible to use legacy code than to re-invent the wheel.

Diversity of Hardware
This is the other problem that leads to complicated code. Hardware diversity needs to be catered for by systems programmers, as compared to application programmers, who have the luxury of using interfaces (provided by systems programmers) to the hardware.

How to get over it
There is no shortcut to success in the systems programming domain. One has to go through the rigor of learning the subject. An easier way could be using the input-output model, which is a way of reducing the code's complexity by focusing on the inputs and outputs of a given algorithm/function. This helps in first understanding the basic goals before diving deeper into the code.

Another familiar method is to develop state diagrams. This methodology requires a little more skill than the previous method but can be extremely helpful in figuring out complicated code.

## Thursday, May 06, 2010

### Harnessing UML in Ubuntu

Software development was never easy, and software design became extremely difficult with the explosive growth of computers and the Internet. Let me accept that I was never a fan of UML. I always believed that software development is an art and that the design process is more of an inspiration than a mechanical process of finding nouns and verbs. Patterns? Well, I thought they were probably the worst creativity-repellents possible. So I was always lousy with UML, until I realized that this aversion was compounded by using MS Visio. Visio is probably the worst place to go to develop UML diagrams. My faith in Visio was developed while using its minimal project management tools (during undergraduate studies), so when it came to doing UML I simply went ahead and used Visio for software design.

Not until 2007 did I come out of the dark ages, when I used Umbrello for the first time. Then I came to realize the true worth of a UML drawing tool. The way in which all the diagrams were integrated gave a sense of connection, while in Visio it was always like drawing a redundant diagram, be it a class diagram or an interaction diagram. With Umbrello this is not the case: the model that is defined in one diagram can be used in another diagram. This functionality is emphasized by the code-generation facility. Although I personally did not like to use this facility, it is a handy tool, especially when dealing with very large projects. The fact that Umbrello supports code generation in many languages adds to the toolbox of the designer.

How to install umbrello in Ubuntu?
$ sudo apt-get install umbrello
Once installed, the program is available in the Applications --> Programming menu.

## Wednesday, April 07, 2010

### Overcoming Python in Ubuntu 9.10

Python's rise to power is evident from claims that Python, alongside PHP/Perl, is the "P" in LAMP. This is equivalent to knighthood amongst geeks. Python fascinates developers who are very conscious about productivity. The first step is to ensure Python is installed, so here is the command to do so in Ubuntu 9.10. (Most likely it would already be installed.)
$ sudo apt-get install python
The first step to testing a language is to write a hello world program.

Hello World Program
Python's manner of saying hello to the world is different. Go to the terminal and write the following command to invoke python.
$ python
Once inside the shell, type print "hello world" and the output hello world will appear.
>>> print "hello world"
hello world
Another way of saying hello from the Python world is to create a text file with the same statement in it, save it as, say, "helloworld.py", and then execute the following statement on the command prompt.
$ python helloworld.py
Or write the following in the interactive shell. (Please note that .py is not written.)
>>>import helloworld
The Next Step Forward
A very easy but powerful module called "subprocess" can be used here. It helps in executing shell commands. The following simply executes the "df -h" command: first import subprocess, and then call it.
>>> import subprocess
>>> subprocess.call(["df","-h"])
Filesystem            Size  Used Avail Use% Mounted on
/dev/sda1             9.4G  5.0G  4.0G  56% /
udev                  490M  228K  490M   1% /dev
Functions in Python
Let's take a leap here and write a recursive function so that our imagination is tested and our concepts are built on a sound footing. Write the following code in a text file and name it factorial.py.
def fact(n):
    if n == 1:
        return 1
    else:
        return fact(n-1)*n
Notice how similar it is to the factorial code written in any other C-lookalike language. The keyword def indicates the start of a function. Now execute the code in the interactive shell using the following commands.
Examples include finding 2!, 10! and 4! respectively
>>> import factorial

>>> factorial.fact(2)
2
>>> factorial.fact(10)
3628800
>>> factorial.fact(4)
24
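For comparison, here is an iterative version of the factorial (my own addition, not from the original post); it avoids Python's recursion depth limit for large n:

```python
def fact_iter(n):
    # Multiply 2..n together instead of recursing
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result

print(fact_iter(10))  # -> 3628800, same as the recursive fact(10)
```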
Now let's write another function, but this time it is not recursive; rather, it is an iterative version of the Fibonacci series. Write the following in a text file and save it as fibonnaci.py. Notice how multiple assignments and increments are being performed.
def fib(n):
    a, b = 0, 1
    while b < n:
        print b,
        a, b = b, a+b
Execute the function in the interactive shell using the following commands.
>>> import fibonnaci
>>> fibonnaci.fib(10)
1 1 2 3 5 8
>>> fibonnaci.fib(100)
1 1 2 3 5 8 13 21 34 55 89
Object-Oriented Python
Python, being a very clean language, also helps one define classes in a sleek manner. Write the following in a text file and save it as PrintName.py.
class NameClass:
    myFirstName, myLastName = "Alexia", "Candella"
    def printName(self):
        print self.myFirstName
        print self.myLastName
Now write the following in the interactive shell to test not only the class method but also the creation of an instance of the class. (Notice that PrintName is the name of the module loaded, hence it is required to create an instance of the class NameClass.)
>>> import PrintName
>>> x=PrintName.NameClass()
>>> x.printName()
Alexia
Candella

It is believed that after doing this little exercise the Python language has been overcome. The next part is to move on and do some serious stuff. This can be accomplished by using the tools of the language, combining them to solve problems, and learning the language along the way. A very good reference is available here.

## Thursday, March 25, 2010

### Experimenting Java Enterprise Edition (EE) SDK 6 on Ubuntu Linux

Installations
Soon after downloading Java EE for an Ubuntu machine through the website, I had a file (~48 MB) called java_ee_sdk-6-web-unix.sh. An attempt to run the shell script resulted in the following error.
Could not locate a suitable Java runtime.
and accessible in your PATH or by setting JAVA_HOME
This requires installation of Java 6 using the following command (another roughly 55 MB).
$ sudo apt-get install sun-java6-jdk
Now go back to the Java EE installation through the following command.
$ sh java_ee_sdk-6-web-unix.sh
Going through the clicks, one only has to enter the password for the admin account and the proxy, if any; the rest of the information is filled in by default.

Verifying Installation
Now go to the web browser and type the following in the address box.
http://localhost:4848
If you changed the admin control panel port, please replace 4848 with the port number you specified during installation. This will open the administration control panel (shown below) and ask for a user name and password. Please enter the username/password which were set during installation.

After the username and password has been entered the following window will appear.

Deploying Sample Application

Download the application hello.war from the website here. Click on Applications in the tree on the right side, click the Deploy button, and then point to the path where hello.war was downloaded.

Then write the following in the address box of the web browser and write your name in the box and click submit.

http://localhost:8080/hello/

The following output will appear, showing that Java EE 6 has not only been installed but is also working.

The next step is to develop applications using the Java EE standard. This requires knowledge of EJB, Components, three tier architecture and advanced java programming.

## Monday, March 22, 2010

### Virtualization :: Old wine in new bottles

Virtualization dates back to the introduction of CP/CMS, later renamed IBM VM/370. VM/370 provided two key ingredients: multiprogramming, and an extended machine with a more convenient interface than the bare hardware [1]. The recent interest in virtualization came on the backdrop of energy conservation and providing services to users. The explosive growth of the needs of the internet and data centers has fueled this renewed interest in virtualization.

Virtualization Software
We have already experimented with a virtual machine player, VMware Player, marketed by VMware Inc. Using it we installed FreeBSD, Ubuntu and Gentoo on top of Windows XP.

What is the next step towards virtualization? The hypervisor. Hypervisors are of two basic types. A type 1 hypervisor provides a direct interface between the guest OS and the hardware, while a type 2 hypervisor has a host OS onto which the virtualization software is installed, with the guest OS installed on top. VMware ESX and GSX are examples of type 1 and type 2 hypervisors respectively.

Benefits
So what exactly does virtualization achieve? The increasing needs of users demanded setting up servers for individual applications. An email server, web server, internet server, print server, file server and many more are routinely installed and maintained by IT departments in large organizations. Now, with the advent of superior hardware, the needs of the users can be served by one machine with all these server applications running on it. This is generally not possible directly, due to the limitations/requirements of using different operating systems for different applications. So it is better to use hypervisors and install as many applications as possible over one set of hardware. This is not only energy efficient but, thanks to hypervisors, as easy to maintain as the individual servers were.

Drawbacks
The primary drawback is performance. Maintenance could also prop up as a problem (the very thing virtualization originally set out to solve), and separate machines might be better from a maintenance standpoint. Another issue is security.

Genuine breakthrough or simply old wine in new bottles
Virtualization is one of the few ideas that came back from the dead and took the world by storm. Its importance is growing due to another technology: cloud computing is getting more and more pervasive, and this has helped virtualization gain more strength and appeal. Clouds of applications can be maintained using virtualization like never before. Calling it old wine in new bottles would be a little too simplistic. Although on the face of it virtualization does borrow from the good old VM/370, the advent of the internet and cloud computing has given the old wine a very nice taste. Applications like VMware vSphere for data center virtualization are changing the meaning of virtualization. Amazon's Elastic Compute Cloud (EC2) enables users to rent computers to run applications, with virtualization as the building block of EC2. Grid computing, an effort where (even geographically) diverse machines come together and try to solve a single task, is defined by Plaszczak/Wellner as "the technology that enables resource virtualization, on-demand provisioning, and service (resource) sharing between organizations." [3]

A very nice but pretty detailed and long introduction is available here.

References
[1] Tanenbaum, A. S., "Modern Operating Systems", Third Edition, 2008
[2] http://www.infoq.com/articles/virtualization-intro
[3] P Plaszczak, R Wellner, Grid computing, 2005, Elsevier/Morgan Kaufmann, San Francisco

## Monday, March 08, 2010

### How and where should I change proxy settings in Linux?

PLEASE NOTE THAT IF YOU ARE USING A DIRECT CONNECTION TO THE INTERNET THEN THE FOLLOWING POST DOES NOT APPLY. TRYING TO APPLY THESE SETTINGS MAY LEAD TO DISRUPTION OF / DISCONNECTION FROM THE INTERNET.

How and where should I change proxy settings in Linux? This is one of the most frequent questions posed by newbie (and not-so-new) Linux users. This mountain can be scaled in three steps.

Step 1: Using a proxy simply to get connected to the internet

Just change the proxy setting in Firefox and/or the browser of your choice and start using the internet. Ignore the rest of Linux's urges to get updates etc., and life will be cool and simple.

Of course, life cannot continue to be so simple. One has to install programs, after all.

Step 2: Adding a proxy for system-wide settings

Now the next step is to set up the proxy as a "system-wide" setting. If using GNOME, go to System -> Preferences -> Network Proxy. Typical settings are the IP address/host name of the proxy server and the port number (typically 8080). Click the system-wide settings button and then authenticate with the root password. This adds system-wide proxy settings and the ability to use package managers and instant messengers.

Step 3: Adding the proxy to the terminal

Reading through the manuals/howtos (including those on this blog), the sudo apt-get install command (or yum, or other variants of package install commands) frequently needs proxy settings. These can be set up via the following sequence.

Edit your .bashrc file, present in the home directory, in your favorite editor. Here vi is used for demonstration.
$ vi .bashrc

Go to the end and add the following lines. Note that # indicates a comment; it is healthy practice to write comments so that later we know why we wrote those particular lines.

# Proxy settings
export http_proxy="http://MY_PROXY_IP_OR_HOST:MY_PROXY_PORT"
export ftp_proxy="http://MY_PROXY_IP_OR_HOST:MY_PROXY_PORT"

Please ensure you replace MY_PROXY_IP_OR_HOST with the proxy server's IP or host name and MY_PROXY_PORT with the port number (typically 8080). Close the terminal and start it again. Test by typing export on the command line; it should print a long list, and at the end you should find:

declare -x http_proxy="http://MY_PROXY_IP_OR_HOST:MY_PROXY_PORT"
declare -x ftp_proxy="http://MY_PROXY_IP_OR_HOST:MY_PROXY_PORT"

(with MY_PROXY_IP_OR_HOST and MY_PROXY_PORT replaced by the IP/host name and port number you specified earlier). A simple test of whether the settings work is the following command (if available); it should open the command-line browser on Google's web page.

$ w3m www.google.com

Now apt-get install and the other commands requiring an Internet connection via the proxy should work.
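The export lines above can be sanity-checked in a throwaway shell before touching .bashrc. The host and port below are placeholders for illustration, not real values; substitute your own.

```shell
# Placeholder proxy values purely for illustration; substitute your own.
export http_proxy="http://proxy.example.com:8080"
export ftp_proxy="http://proxy.example.com:8080"

# 'export -p' lists all exported variables; both proxy entries should appear.
export -p | grep '_proxy='
```

If both variables show up here, the same two export lines pasted into .bashrc will take effect in every new terminal.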

## Monday, March 01, 2010

### Linux LAMP raising jinn and exorcising demons

In the early nineties Linux was a hobby which, in the words of Linus Torvalds, might never be professional like GNU. It took off and shook the foundations of empires aspiring to bleed every penny by selling server operating systems. The contributions of the open source community have caused severe headaches for software companies, and the Apache Foundation is one such name. How much does it cost to set up a web server? About the cost of a LAMP; compare this with the cost of the aptly named vamp, or is it WAMP? What jinn did the lamp raise? What changes did it trigger? A simple question: how come Ethernet and networks were around in the seventies, yet it took just a couple of years after Linux for the Internet to raise its head? The answer to all these questions is that there was no LAMP earlier.

LAMP stands for Linux, Apache, MySQL, PHP (or Python/Perl). This open source combination gives a head start to a web development environment. LAMPs have not been around for very long, and the LAMP was developed gradually: first came Linux, then after a couple of years Apache, followed by MySQL and PHP. The Apache web server became the default choice because, number one, it was open source (and free) and, secondly, it integrated pretty well with Linux. Just imagine telling a Windows user that Apache is configured via the httpd.conf file. The first question asked by the Windows user is: OK, http is understandable, but what is the "d", and secondly, what is the .conf extension? So the integration of Apache into Windows as a service is pretty artificial, and although it works OK, Linux and Apache go together.

Next comes MySQL. MySQL is open source (and free), which makes it the choice for the database, and its whole organization and architecture fit seamlessly into the Linux frame of mind. Just imagine having a web interface for database management and then using ports to access databases. Windows did not deal in ports (until very late).

PHP is purely a scripting language which looks somewhat like C. Again, C and scripts ring Linux/Unix bells. PHP was created circa 1995, as were Apache and MySQL. The Internet boomed, and growth has quadrupled ever since.

## Wednesday, February 24, 2010

### The LISP-end of the computing

LISP is never introduced until very late in school, and then in the form of ONE slide (yes, truly) one slide. So what is it about LISP that earns it so little attention? Looking at the MIT OCW lectures it seems to be THE language of choice. So what is Lisp, in fact? LISP (LISt Processing) is a language where the primary data structure is the list and the modus operandi is list operations.

Step 1:  Installation
Installation of lisp via an apt-based package manager (as on Debian/Ubuntu) can be achieved with the following command.
$ sudo apt-get install clisp

Alongside this it is recommended to install the HTML version of the book Common Lisp the Language, 2nd Edition by Guy L. Steele Jr.

$ sudo apt-get install cltl
The book is installed at /usr/share/doc/cltl/clm/index.html.

Step 2: Testing LISP functionality
Soon after installation, clisp can be tested for functionality. The following sequence of commands tests the lisp environment.
To start clisp
$ clisp

Try out the lisp expression to add 2 and 2; the result is on the second line.

[1]> (+ 2 2)
4

Build up more complex lists. The following expression is the lisp way of (1*2) + 6.

[2]> (+ (* 1 2) 6)
8

Now, with increased comfort and confidence that lisp is installed and functional, assigning values to variables is done via the following command.

[3]> (setq a 10)
10

For an extremely simple loop, the print command prints hello world five times.

[4]> (do ((x 1 (+ x 1))) ((> x 5)) (print "hello world"))

"hello world"
"hello world"
"hello world"
"hello world"
"hello world"
NIL

Step 3: Doing something more than "hello world"
Having dealt with assignments and control structures, the next things are procedures/functions. LISP supports functions via the keyword defun.

[5]> (defun increment (x) (+ x 1))
INCREMENT
[6]> (increment 10)
11

Line 5 defines the function named increment, which takes one argument and increments it by one. Line 6 calls the function increment with the argument passed as the second member of the list. The following code finds the square of a variable x (note that multiplication is written in prefix form, (* x x)).

(defun square (x) (* x x))
(setq x 10)
(square x)

Exploration of LISP in emacs is another really interesting topic which needs a dedicated post.

### Why hibernating Linux might be a great idea

Why might hibernating Linux be a great idea? What supports the idea? There are basically two reasons. One, Linux has traditionally lived up to its billing as a premium server-class operating system. Two, it might sound strange, but Windows has performed much better on laptops using the hibernate option than on desktops. The former is direct evidence supporting the convention, while the latter is circumstantial. Since Windows performs better in spite of its otherwise not-so-glorious record, Linux, with its better history, can be expected to perform better still.
The conjecture that the server's better performance can be attributed to less frequent restarts needs further investigation, but for now we can assume it to be true. Therefore it is worth setting up the hibernate option in the operating system (be it Windows or Linux, on a laptop or a desktop). It is very good for battery life too.

Steps for initiating hibernate in Ubuntu Linux (Karmic 9.10): click on your user name on the task bar and select "Hibernate" from the menu.

Steps for initiating hibernate in Windows XP: the option can be accessed by clicking Control Panel --> Power Options --> selecting the Hibernate tab. Click the box next to Enable Hibernate (it will be checked with a tick mark). Next go to the Advanced tab and select the actions to perform when the lid of the portable computer is closed, the power button is pressed, or the standby button is pressed.

## Thursday, February 18, 2010

### The Evolution that is Linux

The availability of a huge number of Linux distributions is amazing. The primordial soup developed by Dennis Ritchie and Ken Thompson, named UNIX, had in its genesis the most amazing OS, Multics. The primordial soup gave birth to a phenomenon which resulted in the evolution of one beautiful species after another. The rise of Lisa led to the Mac. Even the great evil empire acknowledged its superiority, first trying to develop Xenix and, when that attempt to usurp failed, slowly building the features into its own Windows. A mutation took place in 1991 when Linus Torvalds gave birth to Linux. This led to a never-ending stream of species. Each species has its strengths and weaknesses, but every one shows the brilliance and resilience present in the initial genes. The Linux of today cannot be compared directly to MULTICS or even the first UNIX, but the design is the same, providing solid and unbeatable performance.
Not only are Linux and FreeBSD holding on to the server markets, they are also under the hood of iPhones and ARM processors. Spreading its seeds across all continents, be it the PC market or the laptop or mini-laptop market, the species continues to awe its original developers.

## Thursday, February 11, 2010

### Linux Newbie : Why I cannot use Latex like those gurus?

Latex is another brilliant Linux/Unix specialty hidden behind the geek facade. An interesting introduction here {warning: PDF file} talks in detail about why and how document processing improves as document complexity increases. In the words of the author, "It is very hard to write unstructured and disorganized documents". I like the approach but feel that while this might attract power users, for newbies it is more of the same old "Linux is for geeks" school. The purpose is not to criticize the author, who did a great job and deserves credit for the article.

Now the primary question is how and where to start; following tutorials almost always leads to temporary use. Even among students writing a thesis in latex, only a few will later use it for submitting assignments or other work. The problem is two-fold. First, students and users are not introduced to latex in a friendly manner; rather they use it as if trying to live up to the 'lofty standards' set by advisers or seniors. The second problem was highlighted when a highly motivated student submitted a journal paper in pdf format (produced via pdflatex) to the supervisor. The supervisor demanded an MS Word copy of the paper so as to edit it. The student, completely disgruntled, vowed never to use Latex again. Therefore a complete change of heart is required to accept Latex. Once this step has been taken, getting to grips with Latex becomes easier and fun.

Step one: Getting Latex
The following method is for Ubuntu. Please google / consult the manual on how to install on other distributions, or see here for steps.

Ubuntu / Debian:
$ apt-get install texlive

Step two: Starting up
It is never easy getting used to something as complex, and with such a rich history, as tex. Using a GUI-based tex front end would be a perfect way to never get used to tex itself, since in that scenario it is incorrectly compared to Microsoft Office and OpenOffice. Yes, those programs are good for getting work done, but they are nothing like tex. Why? This will become clear soon.
Write the following text in a file using the editor of your choice (freedom at last).
\documentclass{article}
\title{Welcome to Freedom}
\begin{document}
\maketitle
This is a sample document which showcases the freedom and peace of mind offered by using \LaTeX
\end{document}
The following command creates an output file (in pdf format) with the same name as the tex file.
$ pdflatex firstTry.tex

So here lies the first charm of latex: my readers don't need to buy expensive EDITING software just to read what I write. How did latex decide on the placement of the title and date, and on the fonts? The key to success lies in the fact that the document design and the text have been separated. The user can concentrate on producing text rather than deciding on text placement, font sizes and the other details that confound a user of a word processor. This can be readily appreciated with a simple change to the above code: change \documentclass{article} to \documentclass{report} or \documentclass{book}. The resulting pdf file shows that the formatting of the document can be changed at any stage, and the user need not be bothered with how the document looks while typing it. This is immense relief. This bare-minimum tex file ONLY demonstrates the principle of separation of formatting and content.

Step three: Adding more functionality
Now that the fear of tags has subsided, the following tags add more formatting information.

\section{SECTION_HEADING_HERE}
\subsection{SUB_SECTION_HEADING_HERE}
\subsubsection{SUB_SUB_SECTION_HEADING_HERE}

Incorporate the above tags in the file as below.

\documentclass{article}
\title{Welcome to Freedom}
\begin{document}
\maketitle
This is a sample document which showcases the freedom and peace of mind offered by using \LaTeX
\section{Formating Sections}
The best part is that you don't have to set font sizes and then try to place the section headings
\subsection{Sub Sections are cool}
Adding the simple tag of subsection simply adds more sections, and the special part is that there is no numbering involved. The numbering is dealt with automatically at the time of preparing the pdf file.
\subsubsection{Moving further inside we can enjoy better functionality}
This shows how easy it is to create subsections within subsections. Simply continue to add sub in front of the tag and it goes deeper and deeper.
\section{Simplicity is still the essence}
The current document, if shown to a novice, would scare them away, but we are beginning to get the feel of Latex. The rest is mostly the easier part.
\end{document}

The following screen shot exhibits the usage of the above tags. Adding \tableofcontents after \maketitle adds a table of contents automatically.

Step four: Moving on to power usage
Moving on to power usage calls for the approach of the Chinese proverb: give a man a fish and you feed him for a day; teach a man to fish and you feed him for a lifetime. Instead of going into the details of each and every command/tag, it is far better to take the "I will learn it when I need it" approach. This can only be achieved by keeping tabs on where to get the desired information. One method is to use "The (Not So) Short Introduction to LaTeX 2e" as a reference and simply copy the commands (examples) and tweak them to your requirements. For example, in order to add mathematical equations and use them effectively, the best way is to actually go and check out an example in the tutorial and study it. The following code fragment adds the equations given in the screen shot that follows.

\section{Explaining how to write Mathematical Equations}
One simple way to write mathematical equations is $A_x^2$ or using the tags begin equation and end equation
\begin{equation}
\sum_{i=0}^{\infty} X_i = 0
\end{equation}
\begin{equation}
\omega = \Omega
\end{equation}
\begin{equation}
\sigma = \Sigma
\end{equation}

The code fragment shows that to write a subscript _ is used, while to write a superscript ^ is used. Another thing to notice is that \sigma outputs the lower case Greek letter sigma while \Sigma outputs the upper case Greek letter sigma.

Another method, more in line with the Chinese proverb, is to use the template files provided by the publishers. E.g., the IEEE conference style file (IEEEconf.sty) comes with a pdf which explains all the features and how to use IEEEconf.sty.
Similarly, the University of California Los Angeles (UCLA) thesis style file comes with a readme file which explains how to use it. By maintaining a high level of confidence and continuing to learn Latex while ACTUALLY using it to accomplish different tasks, one reaches better understanding and long-term use.

## Tuesday, February 09, 2010

### How to automatically change proxy setting in Firefox?

Proxy settings for an office LAN/WLAN and a home LAN/WLAN are often different. Firefox provides an easy method of changing proxies without having to do anything. The file proxy.pac offers an easy way of automatically switching proxy settings in Firefox from one network to another. There are elaborate methods of changing proxy settings given here; however, the easiest method is described below.

Step 1: Edit the file proxy.pac (preferably saved at a convenient location) and add the following lines.

function FindProxyForURL(url, host)
{
  return "PROXY proxy.example.com:8080; DIRECT";
}

Here proxy.example.com is simply the address of the proxy and can be replaced by the IP address of the proxy server. 8080 is the http port in most networks but could be different in some.

Step 2: Go to Firefox Menu --> Tools --> Options --> Network tab --> click the Settings button --> check "Automatic proxy configuration URL" and enter "file:///your_path_goes_here/proxy.pac". Please ensure you replace your_path_goes_here with the path where the file proxy.pac is saved. For example, in my case it was file:///home/zahid/proxy.pac

Now Firefox will first try the proxy and, if it is not responding, connect directly to the Internet. This removes the headache of changing proxy settings on an almost daily basis. Peace at last!!
For those still using Internet Explorer ::: please keep changing your proxy at regular intervals :)

## Wednesday, February 03, 2010

### Linux Newbie : Why grep almost never yields something productive

## Getting Started Level 0

Every Linux newbie hears about the power of grep sooner or later. But no sooner does the newbie try to use the grep command than the experiment ends badly. The reason grep almost never yields anything productive comes down to a couple of issues: the basic problem is lack of knowledge of regular expressions, and secondly, lack of knowledge of grep's switches. This realization should not deter a new user from using grep. A very nice, detailed tutorial on regular expressions (regex) is available here.

A migrating user, having used dir and similar commands in DOS and the Windows search box, almost never expects (nor can fathom) the power provided by grep. Firstly, grep means "global regular expression print". This means that knowledge of regex is required to use grep effectively. Now what should a newbie do? Wait until the user's regex knowledge improves? NO, NOT AT ALL. That would scare the user away from ever using grep. Here is the first simple command; it displays the files in the current directory.

grep -l ".*" *

REMEMBER this only lists files (not directories). This is not what grep is for, but it is given simply to build the user's confidence that "yes we can" use grep. The command ls can be used to list the contents of the current working directory; grep is used to do something more productive. So let's start doing something ls cannot do. Let's find all the files with the word "include" IN them. The following simple command does this.

grep -l "include" *

To search all the subdirectories, add r to the options, which means search recursively.

grep -lr "include" *

This is all one actually needs to start using grep a little more effectively.
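The level-0 commands above can be tried safely in a scratch directory. The file names and contents below are made up purely for the demonstration:

```shell
# Build a small sandbox: one file containing "include" and one that doesn't.
mkdir -p grep_level0 && cd grep_level0
printf '#include <stdio.h>\nint main(void) { return 0; }\n' > hello.c
printf 'just some plain notes\n' > notes.txt

grep -l ".*" *          # lists every file that has at least one line
grep -l "include" *     # lists only hello.c, the file containing "include"
cd ..
```

Running the two greps shows both files for the first pattern and only hello.c for the second.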
## Jumping to Level 1

To be more productive one has to start using regular expressions. A step-by-step approach works better than first going off to a regular expressions tutorial and coming back totally lost. The following can serve as the first command that uses a regular expression. The average user might not even sense a difference, because it is essentially the same command as above, yet the slight modification makes a huge difference. To check for files which contain either trap or drap, put t and d in brackets before rap and search. The following command finds all the files containing either trap or drap.

grep -l "[td]rap" *

## Jumping to Level 2

This simple addition has enhanced our power of using grep. Two more characters, "^" and "$", add further power. The following command searches for files which contain lines starting with the letter a.
grep -l "^a" *
And the following command searches for lines ending with the letter d.
grep -l "d$" *

In combination, the following command finds the files which contain lines starting with the letter a and ending with the letter d.

grep -l "^a.*d$" *
This was accomplished by simply adding ".*" between the two letters, which matches anything between the start and end patterns.

## Jumping to Level 3

Going back to the brackets discussed earlier: multiple characters can be added to a search simply by putting them in brackets. E.g., to search for lines starting with a, e, i, o or u, the following command can be used.
grep -l "^[aeiou]" *
This command searches for all the lines that start with a vowel, i.e., a, e, i, o or u. Brackets can be used in combination, as in the following command, where the files listed contain lines that start with a or b and end with d or s.
grep -l "^[ab].*[ds]$" *

The expressions in brackets can be improved with ranges and groups.

A-Z matches all the upper case letters
a-z matches all the lower case letters
0-9 matches all the digits

Using the above ranges, the following command searches for files containing lines which start with a letter and end with a digit. Combining A-Za-z ensures that the starting letter can be either upper case or lower case.

grep -l "^[A-Za-z].*[0-9]$" *
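All of the levels above can be exercised in one small sandbox. The four one-line files below are invented for the demonstration:

```shell
# One file per pattern of interest.
mkdir -p grep_levels && cd grep_levels
printf 'apple\n'  > f1    # starts with a
printf 'trap\n'   > f2    # contains trap
printf 'abcd\n'   > f3    # starts with a, ends with d
printf 'zebra9\n' > f4    # starts with a letter, ends with a digit

grep -l "[td]rap" *            # level 1: trap or drap            -> f2
grep -l "^a" *                 # level 2: lines starting with a   -> f1 f3
grep -l "^a.*d$" *             # starts with a, ends with d       -> f3
grep -l "^[A-Za-z].*[0-9]$" *  # level 3: letter start, digit end -> f4
cd ..
```

Each command lists exactly the file(s) noted in its comment, which makes the effect of the anchors and character classes easy to see.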
The above discussion is enough for a Linux newbie to appreciate the grep command and be comfortable at the different levels. To attain more power with grep, consult the man/info pages of grep or any advanced grep or regex tutorial on the Internet.

References
1. Linux Manual grep

## Friday, January 29, 2010

### The VNC Alternative of using Linux over Windows

A popular method of connecting to a Linux machine is via SSH (Secure Shell), a protocol for exchanging data over a secure channel [1]. The beauty of this channel is that it can easily be extended to provide other facilities like SFTP (Secure File Transfer Protocol) or SCP (Secure Copy). One can get hold of a host of small applications from www.ssh.com [2]. Another method is VNC (Virtual Network Computing) [3], a desktop sharing method that uses the remote framebuffer protocol.

The VNC alternative of using Linux over Windows lets one enjoy Linux while still running Windows. The only constraint is that one has to use an extra machine for Linux, but the fact that it needs no monitor, keyboard or mouse helps with both ease of use and housekeeping (tuck the Linux box away under any table in your office and no one will know it exists). The key to this idea is two simple commands on the two operating systems.

#### Linux Box

Step 1: Run the vncserver command and then the vncpasswd command to set a password for entry. The first command will print "New 'X' desktop is HOST_NAME:DISPLAY_NUMBER".

#### Windows

Step 2: Install vncviewer from the website (http://www.realvnc.com/cgi-bin/download.cgi). Then run vncviewer and enter HOST_IP_ADDRESS:DISPLAY_NUMBER. Clicking OK will prompt for the password; enter the password set with the vncpasswd command.
And boom ... one is able to use Linux X-Windows Graphical Interface over Windows Machine.

#### References

[1] http://en.wikipedia.org/wiki/Secure_Shell
[2] www.ssh.com
[3] http://en.wikipedia.org/wiki/Virtual_Network_Computing

### Two easy steps to installing Firefox 3.6 in Ubuntu 9.10

Browsing through Google results for installing the latest Firefox 3.6 release on Ubuntu 9.10 looks like a headache, while in fact it is extremely simple. Just go to www.firefox.com and click install, go through the motions of downloading, and unpack the downloaded tar archive.

Simple, is it not!
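The manual install really is just unpacking the tarball. The archive below is a stand-in built on the spot so the commands can be followed end to end; with the real download from firefox.com only the extract line and the real file name matter:

```shell
# Stand-in for the downloaded firefox-3.6.tar.bz2 (a dummy launcher inside).
mkdir -p stage/firefox
printf '#!/bin/sh\necho "firefox 3.6"\n' > stage/firefox/firefox
chmod +x stage/firefox/firefox
tar -cjf firefox-3.6.tar.bz2 -C stage firefox

# This is the only step needed with the real download: unpack into ./firefox/
tar -xjf firefox-3.6.tar.bz2
./firefox/firefox       # run the launcher inside the unpacked directory
```

With the genuine archive, the unpacked firefox directory contains the real browser binary in the same place.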

## Wednesday, January 27, 2010

### Two steps to using Assembly in Linux (ubuntu 9.10)

Most programmers shy away from assembly language. The reason is not that they lack a solid background in assembly, but that they don't know how to write code outside the simulators they are familiar with, especially when it comes to using assembly under Linux.

It comprises two steps.

#### 1. Compiling with assembler

The assembler in Linux is as. The command to assemble a program, say helloWorld.s, is the following.
$ as helloWorld.s -o helloWorld.o

#### 2. Linking and Executing the code

The output of assembling is object code, which needs linking. The command for linking the object file helloWorld.o:

$ ld helloWorld.o -o helloWorld.out
To execute the code we need to do the following.
$ ./helloWorld.out

The code for helloWorld.s is attached here, with thanks to reference 1.

.text
.global _start
_start:
    movl    $len,%edx       # third argument: message length
    movl    $msg,%ecx       # second argument: pointer to message to write
    movl    $1,%ebx         # first argument: file handle (stdout)
    movl    $4,%eax         # system call number (sys_write)
    int     $0x80           # call kernel

    # and exit
    movl    $0,%ebx         # first argument: exit code
    movl    $1,%eax         # system call number (sys_exit)
    int     $0x80           # call kernel

.data                       # section declaration
msg:
    .ascii  "Hello, world!\n"   # our dear string
len = . - msg               # length of our dear string

References :
1. http://asm.sourceforge.net/howto/hello.html

### Trial of Ruby in Ubuntu 9.10

Ruby, the pinnacle of OOP, is a language described by its creator Yukihiro Matsumoto as more object oriented than Python and more powerful than Perl. I set out to test it on my Ubuntu 9.10. Given Ubuntu's excellent support and repositories, no wonder not only ruby but also ruby on rails was available in the repositories. I just had to shoot out the following commands.

### Installation

$ sudo apt-get install ruby-full build-essential
$sudo apt-get install rails$ sudo apt-get install vim-ruby
$ sudo apt-get install vim-rails

The same can be accomplished via the Synaptic package manager in a more graphical environment by selecting System -> Administration -> Synaptic Package Manager, then searching for the packages ruby and rails, selecting both, and clicking "Apply". The benefit of installing vim-ruby and vim-rails is that vi then provides syntax highlighting for ruby and rails. Soon after installation I went ahead and wrote the following program to test the ruby language.

testString="aaabbbbc"
if testString=~/^a*b*c$/
print testString, " is in the language a*b*c\n"
else
print testString, " is not in the language a*b*c\n"
end
testString="aaaaacb"
if testString=~/^a*b*c$/
print testString, " is in the language a*b*c\n"
else
print testString, " is not in the language a*b*c\n"
end

This completed the test of ruby in Ubuntu. More about ruby on rails later.

## Tuesday, January 26, 2010

### Upgrading Ubuntu from Hardy (8.04) to Karmic Koala (9.10) via Intrepid Ibex (8.10) and Jaunty Jackalope (9.04)

I finally realized my Hardy (8.04) desktop needs upgrading, even though I had decided not to upgrade until a new LTS (Long Term Support) Ubuntu is released. Upgrading directly from Hardy (8.04) to Karmic Koala (9.10) is not supported, so the logical path is to do the following upgrades.

## Step 1. Hardy (8.04) to Intrepid Ibex (8.10)

## Step 2. Intrepid Ibex (8.10) to Jaunty Jackalope (9.04)

## Step 3. Jaunty Jackalope (9.04) to Karmic Koala (9.10)

Each step follows a similar pattern.

$ sudo do-release-upgrade

The same can be done via the graphical user interface (GUI) by selecting System -> Administration -> Software Sources, selecting the Updates tab, and choosing "Normal Releases" from the release upgrade combo box. Then select System -> Administration -> Update Manager.

This is followed by countless megabytes of download and the hope that the net stays connected all along.

## Monday, January 25, 2010

### Desirable features in a text editor in Linux or Windows

The selection and choice of editors is something Windows users cannot imagine, let alone comprehend. Windows comes with at most two text editors and one word processor: Notepad and WordPad come under the heading of text editors. WordPad is quite useful in this respect and Notepad is very handy. However, if one wishes to do any advanced word processing one has to shift to Microsoft Word. No doubt various text editors like TextPad, Emacs for Windows, and Vi for Windows are available to download.

Unix-like systems, be it FreeBSD, Ubuntu, Solaris, Red Hat, Gentoo or countless others, come with a host of editors for text editing. What are the features one loves to have in a text editor?
• F-01. EASE of use
• F-02. EASE of scrolling/ searching text
• F-03. EASE of editing text
• F-04 Language syntax highlighting / command menu
F-01 EASE of use
The best at satisfying F-01 is nano. The reason is that most of the commands are very easy to remember; moreover, the most needed commands are continuously shown at the bottom of the screen. More commands can be accessed via the Ctrl-G help menu.

F-02 Ease of scrolling/ searching text
Nearly all the editors support scrolling up and down via the arrow keys, which comes in handy, and Page Up / Page Down are also very convenient for scrolling; this feature brings them all to an equal level. The search feature of vi is extremely effective. The only thing needed is to go to command mode and type "/SEARCH_TEXT_GOES_HERE", replacing SEARCH_TEXT_GOES_HERE with the text to search; the cursor will go to the first occurrence, and then pressing "n" cycles through the rest of the occurrences. vi's ":s" command additionally provides an elaborate mechanism for search and replace.
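vi's search-and-replace (e.g. :%s/old/new/g) has a close shell-side analogue in sed, which is handy when the same edit has to be scripted rather than done interactively. The file name and words below are invented for the sketch:

```shell
# A two-line sample file to edit.
printf 'hello world\nhello again\n' > sample.txt

# Like :%s/hello/goodbye/g in vi: substitute on every line of the file.
sed 's/hello/goodbye/g' sample.txt
```

The command prints "goodbye world" and "goodbye again" while leaving sample.txt itself unchanged; redirect the output (or use an in-place flag where supported) to keep the edit.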

F-03 Ease of editing text
Basic text editing is more or less the same in all the editors. Efficiency can only be improved by learning a couple of new commands. Here nano again takes the lead by showing the text editing commands at the bottom at all times.

F-04 Language syntax highlighting / command menu
Language syntax highlighting is a feature largely missing on the Windows platform; there it is provided in the Visual Studio product line, and only for a particular range of products. The Linux editors (nano being an exception) provide language syntax highlighting, which comes in handy. The best part is that new language syntaxes are continuously being added. Some editors also provide features integrating compilation with gcc and/or other compilers.

The features list is not comprehensive but for the needs of an average user these are sufficient to be able to select an editor. In the end the jury is still out for the best editor but given user preferences / usage the best editor can be shortlisted via the above feature lists.

## Sunday, January 24, 2010

### Experiment :: FreeBSD on VMware Player on top of XP

Installing FreeBSD on top of XP via VMware Player. The experience has been pretty nice so far. The installation steps were pretty simple.

1. Downloaded the FreeBSD ISO image
2. Then installed the ISO via VMware Player

The first reboot led me to a $ prompt and the ability to run various commands. The next step was to install KDE. I started off by executing make install clean on the command line in the directory /usr/ports/x11/kde4/. This led to downloading KDE from the Internet. Since no proxy was set up, an error was encountered; this was solved by setenv http_proxy http://myproxyaddress:8080.

The command line came up easily; the only problem was that KDE failed to install, giving the error "undefined reference to vtable projectbuildermakefilegenerator". Trying to figure out what to do with this error: tried to google, rebuild ports, reload ports, and let's see what happens.

But overall again a nice experiment.

Later I tried pkg_add -r kde4

## Friday, January 22, 2010

### Why nano is still better than Vi? Pine or not ?

One of the classic debates amongst Linux geeks is whether Vi is better than emacs and vice versa. I have used all three and personally I like nano. The reason is not functionality richness or any other performance metric but simply the fact that I had been using PICO (Pine Composer) for a long, long time, roughly five years. Pico was first developed at the University of Washington and came alongside the famous email program pine (Program for Internet News and Email). The skills developed while using pine to quickly check email and reply via pico were instrumental in building this skill set. I did my data structures, operating systems and networking lab coding in the same pico editor, and this led to my constant hookup with pico. GNU's clone of pico is nano, and this does not let me use vi and/or emacs. Even though vi is supremely more effective and productive than nano, history is on nano's side.

## Thursday, January 21, 2010

### Operating Systems Books Review

Last summer I taught Data Structures and found it to be the hardest subject to teach. Before teaching Data Structures I had either come across freshmen or handled mature students of the final year or the second half of the third year. That was the first time I encountered second-year students (of course I teach Discrete Maths to second-year students, but that is a completely new course) and had to teach them a course which relied strongly on concepts built in the previous course. Artificial Intelligence and elective courses mostly have their own pace and agenda and are therefore not hard to teach.

I am going to teach Operating Systems for the second time. Teaching Operating Systems from Tanenbaum's Modern Operating Systems looks like a cakewalk for the teacher but a bed of thorns for the students. The question is how to make it more interesting. The alternative is to use Tanenbaum's book Operating Systems Design and Implementation, complemented by the MINIX OS developed by Tanenbaum.

Operating System Concepts by Silberschatz & Galvin is also a nice review of operating systems concepts and a book popular with lecturers. A good reference book that I encountered is Solaris Internals by Jim Mauro & Richard McDougall. I used it as a reference book while teaching Computer Architecture in undergraduate classes. Part One describes kernel primitives and services. Part Two elaborates the Solaris memory system. The third and fourth parts deal with the comparatively hard subjects of threads, processes and IPC, and files and file systems. Overall the book is designed to help the reader understand the underlying architecture of Solaris.

The most important question that an instructor needs to address is the overlap between Operating Systems, Distributed Systems and, to some extent, Computer Networks. This is left as a topic for a later post. However, it sure is the dilemma of instructors at the end of the course while selecting topics.

## Wednesday, January 20, 2010

### VMware Player Gateway to Linux Experience

Recently I got a recommendation to use VMware Player to install Linux, and ever since it has grown on me. I have installed Ubuntu 9.10, Edubuntu, Ubuntu 7.10 and Gentoo. I especially like the way it dynamically uses memory instead of reserving a bulk of memory space. If I were to actually partition the hard disk and install all these OSes, I would have had to give up somewhere between 3 GB and 30 GB; now I am only using 11 GB of space. Apart from this benefit, another amazing feature is that a separate IP address is available for the guest (even when connecting to a DHCP server). This not only keeps things clean and simple but also allows interaction with the guest OS via the network interface. Another interesting feature is the automatic restoration of the virtual machine state, which comes in especially handy when dealing with power failures or unintentional closing of the window.
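Because the guest gets its own IP address, it can be reached like any other machine on the LAN. A quick sketch, where the address and username are placeholders (the real address can be read with ifconfig inside the guest):

```shell
# Inside the guest: see which address the DHCP server handed out.
ifconfig

# From the host (or any other machine on the LAN), using a placeholder IP:
ping 192.168.80.128        # check the guest is reachable
ssh user@192.168.80.128    # interact with the guest over the network interface
```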

The installation steps are pretty simple.
1. Install VMware Player from the website (http://www.vmware.com/products/player/).
2. Download the ISO and then use it. If you don't have the ISO, use the CD-ROM: either build an ISO from the CD/DVD using Roxio or similar software, or install directly from the CD/DVD.
3. Follow steps on the screen just like you would in usual installation.