Monday, June 11, 2007

It's fun to make pages with YUI

Lately I have been experimenting with the Yahoo! User Interface (YUI) library. It's a JavaScript/Ajax library for building web interfaces. It has controls similar to the Win32 common controls: TreeView, DataTable, DataGrid, Button, Menu, etc. This library makes "web page making" so easy. It also has a connection manager, an XML parser, an event handler, etc. to make life easy with Ajax-style scripting. It provides standard CSS files (Reset CSS, Font CSS, Grid CSS) which let you make similar-looking pages irrespective of the browser. I haven't tried much of the CSS files, though.
I made an FTP browser using YUI. Check it out. Obviously, there is more to it than just YUI. The server-side scripting for it is part of what I was building for FTP Search. But it seems that I don't have time for it now :)

Saturday, May 12, 2007

Finding out available FTP servers on LAN : Part 2

I talked about finding FTP servers on a LAN using nmap in a previous post. Well, looks like I've got a better alternative to nmap. And what is it? Well, it's no great software... it's just a small 50-line program that attempts to connect on port 21 to the PCs on the LAN. If it can connect, then that PC has an FTP server hosted on it... add it to the list. It takes about 1.5 minutes on my network to scan 8960 IPs.
I was surprised that such a small and simple program could outperform a giant tool like nmap. Or maybe I could not fine-tune nmap to my needs. Anyway, this is the exact fine-tuned nmap command I was using, which takes about 3.7 minutes to scan a list of 8960 IPs:-

$ nmap -n -P0 -p 21 --max_rtt_timeout 100 --max_retries 0 -oG ftpLog.gnmap -iL IPs.txt

I also tried the min_parallelism and min_hostgroup options but they didn't provide any speedup. Maybe some nmap expert could comment on that. One more thing worth noting: if I remove the -P0 option, in other words enable pinging, it takes only 1.5 minutes! A great improvement... but it misses some FTP servers (those whose firewalls block ping). So I had no choice but to use my own program.
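One more thing I could try (untested on our LAN, so treat the exact invocation as a sketch): nmap's TCP SYN "ping" can probe port 21 itself for host discovery instead of ICMP, which should keep the speedup of pinging without skipping the hosts whose firewalls block ping:

```shell
$ nmap -n -PS21 -p 21 --max_rtt_timeout 100 --max_retries 0 -oG ftpLog.gnmap -iL IPs.txt
```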
This is the program in case someone is interested:-

/*
* NetworkScanner.c
*
* A fast network scanner that scans for a specified open port.
*
* It is an enhanced version of propecia.c created by Troy Robinson
* Created : 02/05/2007
* Author : Sandeep Kumar aka Turbo : http://students.iiit.ac.in/~sandeep_kr
*
* Usage: ./a.out <port> <parallelLimit> <ipfile> <outfile>
* Examples of Ip in ipfile:-
* 172.16-31.*.*
* 192.168.36.200
* 172.*.*.*
* 172.16.0-255.0-255
*
* Sample Usage: ./a.out 21 255 IP_List.txt Out.txt
*
*/

#include <stdio.h>
#include <stdlib.h>
#include <ctype.h>
#include <unistd.h>
#include <sys/types.h>
#include <sys/socket.h>
#include <sys/wait.h>
#include <netinet/in.h>
#include <arpa/inet.h>
int ParseIPPart(char *ipPart, int n, int *s, int *e, int starCtr, int dashCtr);
int ParseIP(char *ip, int s[4], int e[4]);
int main (int argc, char *argv[])
{
    if (argc != 5)
    {
        printf("Usage: %s <port> <parallelLimit> <ipfile> <outfile>\n", argv[0]);
        printf("Examples of Ip in ipfile:-\n172.16-31.*.*\n192.168.36.200\n172.*.*.*\n172.16.0-255.0-255\n");
        printf("Sample Usage: %s 21 255 IP_List.txt Out.txt\n", argv[0]);
        exit(1);
    }
    int port = atoi(argv[1]);          // Port to scan
    int parallelLimit = atoi(argv[2]); // Max no. of parallel processes (using fork)
    char *fname = argv[3];             // input filename
    char *outFname = argv[4];          // output filename
    int procCtr = 0;                   // current count of parallel processes
    char ip[20];                       // scanned ip string from input file
    int s[4], e[4];                    // ip range as parsed from scanned ip string
    char host[16];
    int i1, i2, i3, i4;
    int sockfd, result;
    struct sockaddr_in address;
    FILE *fp = fopen(fname, "r");
    if (fp == NULL)
    {
        perror("fopen");
        exit(1);
    }
    FILE *fw = fopen(outFname, "w");
    if (fw == NULL)
    {
        perror("fopen");
        exit(1);
    }
    while (fscanf(fp, " %19s", ip) != EOF)
    {
        if (ParseIP(ip, s, e) == -1)
        {
            fprintf(stderr, "Invalid IP Range: %s\n", ip);
            continue;
        }
        //printf("Range is:%d-%d.%d-%d.%d-%d.%d-%d\n",s[0],e[0],s[1],e[1],s[2],e[2],s[3],e[3]);
        for (i1 = s[0]; i1 <= e[0]; i1++)
        for (i2 = s[1]; i2 <= e[1]; i2++)
        for (i3 = s[2]; i3 <= e[2]; i3++)
        for (i4 = s[3]; i4 <= e[3]; i4++)
        {
            sprintf(host, "%d.%d.%d.%d", i1, i2, i3, i4);
            if (procCtr >= parallelLimit) // throttle: reap one child first
            {
                wait(NULL);
                procCtr--;
            }
            int childPid = fork();
            if (childPid < 0)
            {
                perror("fork");
                continue;
            }
            if (childPid > 0) // parent: count the child and move on
            {
                procCtr++;
                continue;
            }
            // child: try to connect to host:port
            address.sin_family = AF_INET;
            address.sin_port = htons(port);
            address.sin_addr.s_addr = inet_addr(host);
            sockfd = socket(AF_INET, SOCK_STREAM, 0);
            if (sockfd < 0)
                exit(1);
            result = connect(sockfd, (struct sockaddr *) &address, sizeof(address));
            if (result == 0) // connection succeeded: the port is open
            {
                printf("%s\n", host);
                fprintf(fw, "%s\n", host);
                fflush(fw);
            }
            close(sockfd);
            exit(0);
        }
    }
    while (procCtr > 0) // reap the remaining children
    {
        wait(NULL);
        procCtr--;
    }
    fclose(fp);
    fclose(fw);
    return 0;
}
// Parses one part of an IP. An IP has 4 parts separated by dots.
int ParseIPPart(char *ipPart, int n, int *s, int *e, int starCtr, int dashCtr)
{
    int i, j;
    if (starCtr) // For *
    {
        if (n != 1) return -1;
        *s = 0; *e = 255; return 0;
    }
    if (dashCtr == 0) // For a normal number without star or dash
    {
        if (n == 0 || n > 3) return -1;
        *s = 0; for (j = 0; j < n; j++) *s = *s * 10 + ipPart[j] - '0';
        if (*s > 255) return -1;
        *e = *s;
        return 0;
    }
    if (dashCtr == 1) // For a dash range a-b
    {
        for (i = 0; i < n; i++) if (ipPart[i] == '-') break;
        if (i == 0 || i == n - 1 || i > 3 || n - 1 - i > 3) return -1;
        *s = 0; for (j = 0; j < i; j++) *s = *s * 10 + ipPart[j] - '0';
        if (*s > 255) return -1;
        *e = 0; for (j = i + 1; j < n; j++) *e = *e * 10 + ipPart[j] - '0';
        if (*e > 255) return -1;
        return 0;
    }
    return -1; // more than one dash
}
// Parses an IP of the format a-b.c-d.e-f.g-h into the ranges s[] and e[]
int ParseIP(char *ip, int s[4], int e[4])
{
    int last = 0, partCtr = 0, starCtr = 0, dashCtr = 0, i;
    for (i = 0; 1; i++)
    {
        if (ip[i] == '.' || ip[i] == '\0')
        {
            if (last == i || starCtr + dashCtr > 1) return -1;
            if (ParseIPPart(ip + last, i - last, &s[partCtr], &e[partCtr], starCtr, dashCtr) == -1) return -1;
            starCtr = 0, dashCtr = 0;
            last = i + 1;
            partCtr++;
            if (ip[i] == '\0') break;
            if (partCtr == 4) return -1;
        }
        else if (isdigit((unsigned char)ip[i])) ; // digits are accumulated by ParseIPPart
        else if (ip[i] == '*') starCtr++;
        else if (ip[i] == '-') dashCtr++;
        else return -1;
    }
    if (partCtr != 4) return -1;
    return 0;
}

This could be used for scanning any port, not just the FTP port (21). The comments in the code say all about the program.

PS: I wonder why Blogspot does not provide code tags. Users have been asking for them for ages, but Blogger just ignores us all. I need to move to Wordpress.

FuckProxy vs IIITLANBrowser

Just now I got a scrap on Orkut from Ankur Khare thanking me for FuckProxy. I am always delighted to hear such compliments. But I noticed he was using it from his home. I was surprised that FP could work from outside campus, as I had specifically disabled it for requests from outside IIIT on Monga's request. Actually it has a bug and it works from everywhere. Need to fix that.

FP is not meant to be used from outside. If you are outside, use IIITLANBrowser. It provides a login mechanism so that only IIITians can use it. Also, FuckProxy's URL-redirect mechanism is set up for inside the campus. I guess people manually change URLs to view pages through FP. Why take that pain when you have an alternative?

Sunday, April 29, 2007

Finding out available FTP servers on LAN

It's summer vacation now and I have again started doing bc work (as I call it), though I am not so free this vacation :) Shwetabh and I are trying to build an FTP Search which will allow you to find movies, etc. available on the FTP servers on our LAN. I know there is already a similar site built by Paresh Jain on 150/media. But (it SEEMS TO ME that) it only scans a list of manually specified servers, which were specified long, long back. [Update: The server list is PROGRAMMATICALLY updated once in a while (around a month).] Another thing is that its interface does not allow a proper search (although you can of course use Ctrl+F). Also, we wish to provide a DC++-like interface for browsing FTP servers. And not to forget, having something showable is a plus point during job interviews.

So in the process of researching how best to do this, I came across nmap - the network security tool written by Fyodor. It has lots and lots of options and corresponding uses... quite a good tool... awesome. This is how I get a list of available FTP servers at any point of time:-

// basic command. can be optimized.
$ nmap -p 21 -iL IPs.txt -oG ftpLog.gnmap ; grep open ftpLog.gnmap | cut -d ' ' -f 2

where IPs.txt contains the range of IPs to look for. This is my IPs.txt :-
172.16.1-16.*
172.16.18-20.*
172.16.22.*
172.16.24-32.*
172.17.0.*
172.17.8-10.*
172.17.16.*
192.168.36.*
[ Do tell me if I have missed any IP ]
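Incidentally, these ranges expand to exactly the 8960 addresses I mention scanning in Part 2: each a-b part contributes (b-a+1) values and each * contributes 256. A quick arithmetic check in the shell:

```shell
# 16, 3, 1, 9, 1, 3, 1 and 1 /24-sized blocks respectively, 256 hosts each
echo $(( (16 + 3 + 1 + 9 + 1 + 3 + 1 + 1) * 256 ))
# prints 8960
```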


I adjusted the timings, etc. to find the best configuration for scanning our LAN. Actually, I am still tinkering with it and will post the exact command later. Any suggestions for improving the performance? At present it takes around 5 minutes.
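For the curious, the grep/cut part of the command above just picks the second space-separated field (the address) out of the greppable -oG lines that mention "open". A made-up sample line (the exact -oG format may vary across nmap versions) shows the idea:

```shell
# a sample line in the style of nmap's greppable (-oG) output
line='Host: 172.16.1.5 () Ports: 21/open/tcp//ftp///'
# keep lines containing "open", then cut out field 2, the IP
echo "$line" | grep open | cut -d ' ' -f 2
# prints 172.16.1.5
```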
Part 2.

Thursday, January 11, 2007

Welcome to the launch of IIIT LAN Browser 1.0 and FuckProxy 0.2

So, all my readers (that's only me, I guess :P But isn't this blog meant for me only?), I did some good work this winter vacation. I released FuckProxy 0.2, the upgrade for the old 0.1, and made a new Firefox extension, IIIT LAN Browser. See my post here that kind of advertises it :D

This is the change log for FuckProxy 0.2:
1. Fixed a few bugs.
2. Added better support for FTP.
3. Added auto-update feature.
4. Changed the position of the FP button, as the previous position was interfering with the toolbars of delicious, etc.
5. Typing "http" in the address bar is no longer compulsory.

IIIT LAN Browser is an extension replacing the LAN Browser CGI proxy I had earlier. It provides the same functionality but is much easier to use now. Apart from that, it's much more robust. The earlier version used to throw errors on half the sites because of bad parsing. But this one does not parse at all: it directly uses the DOM tree generated by Firefox! So, no problem at all. I guess no CGI proxy in the world would have parsing as robust as this one. (Well, if I don't parse at all and use Firefox's work, then obviously it will be robust :P )
I also added login functionality so that only IIITians can use it. You need to enter your 200 login and password to use it. Have fun...

PS: I have always been saying "PHP sucks, Python rocks". But it looks like that's not so true. However good Python may be, when it comes to making sites, PHP has the most functionality available. Python does not even have basic session support!

How to refresh a page that is not loaded?

First of all, let us understand the heading of my post ;) If a page is already loaded in the browser, there are several methods to refresh it. You will find them all over the internet.
For example, you can use:-

  1. <meta http-equiv="refresh" content="5">
  2. location.reload(true)
  3. javascript:history.go(0)
  4. etc.
However, consider a different scenario. I typed the URL of a page "A" that requires login. The site redirects me to the login page "B". After I login successfully, I am again redirected to the actual page that I requested, that is, page A. Now the browser knows from its cache that when it earlier requested page A, it was redirected to page B. So it will not send a request for page A to the server; it will directly load the contents of page B instead! The browser is doing the right thing by trying to save network bandwidth and reduce server load. But in our case it's a problem.
Now you will say that this scenario happens on most sites and they all work perfectly fine. The reason is that most sites, when they redirect you to page A after a successful login, change the URL of A a little bit. Mostly you will find that after you login, there is some sid or username appended to the URL. That addition makes the URL not the same as A's, so the browser loads it all over again and the site seems to work fine (though the site maker has no idea about all this and was just lucky).
But I have a site where the URL of page A does not change. There are some restrictions and the URL of page A has to remain exactly the same. So how do I solve the problem? I searched all over the net but couldn't find a solution (though there must be one). Right now I am using this method:-

<html>
<body onLoad="document.form1.submit();" >
Login successful. You are being redirected to the page requested by you.
<form method="post" action="URL of the page to redirect to" name="form1">
</form>
</body>
</html>

Note that the above page asks the browser to use the POST method. And the browser knows that pages fetched with the POST method are dynamic, so it can't rely on its cache. It will ask for a fresh copy of the page from the server. That's it.

PS: Even if the URLs are the same, the browser will sometimes fetch a new copy from the server. This is just my observation. In my lab, I had no problem with the simple redirection, but I found that at a few other places the cached copy was being used, so I had to use the POST method. I guess this all depends on the browser settings and proxy settings.

PS: The above problem happened to me with my IIIT LAN Browser project.

Tuesday, January 09, 2007

Shell: It's simply awesome man!

Linux lovers, please don't start patting your backs, because the Shell in consideration here is not the Unix shell but the shell JavaScript bookmarklet developed by Jesse Ruderman. Man, it gives you so much insight into a webpage. You can see the whole DOM structure. Though the DOM Inspector does the same, it's much easier to use the command line. And the DOM Inspector won't let you execute commands, etc. In the shell you can live-edit your page.
Remember, many times you write a page with a little JavaScript in a console and then test it in a browser to see if your functions are working or not. Many times you don't even know whether a function exists or not: you use it and then run it in the browser to see if it works. Many times you don't know how a particular function behaves. Man, all this can be done very easily in this shell.

For me, the best part is this:
I am new to JavaScript and the DOM, etc. When I code something, I don't know which function will work correctly where, so I code with a lot of assumptions in my mind; finally, when I don't see the expected results, I have a tough time debugging. This shell will now help me in this area.
I am sure that if I had known about it before I made the FuckProxy and IIIT LAN Browser extensions, I would have made them in a much better and more efficient way, and in much less time too.
Thank You Jesse Ruderman!

Friday, January 05, 2007

FuckProxy Update

FuckProxy has been updated to work with the latest Firefox version, 2.0.0.1.
I have kept the old version number, so in order to install it, please uninstall the previous version first.
I am thinking of enabling the auto-update option so that Firefox automatically looks for updates as it does for other extensions. But that will take time.
Also, FuckProxy has been disabled for requests from outside the IIIT intranet. This has been done to prevent possible misuse of IIIT resources.

Enjoy!!