New Landmarks in IT

An event marking an important stage of development or a turning point in history.

Saturday, November 01, 2008

amitshah.in

A new site has been launched: amitshah.in

This site has an extremely good collection of articles from various streams like PHP, MySQL, Health, Jainism, etc. I have also added the latest news feeds from various news portals, which help you stay connected with current affairs. The photo gallery is one of the good sections, and its content comes from Picasa.


Visit now....


Monday, June 09, 2008

PHP vs .NET

Survey after survey continues to show the popularity of PHP over any other web-based language for creating anything from the simplest single home page to complex e-commerce and enterprise solutions. Recent Netcraft surveys show PHP installations on more than twenty million (20,000,000) domains. This puts it well in front of the second-place getter, which is Perl.

So, what is it about PHP that makes it so popular, and why is .NET (along with other Microsoft languages) dragging the chain among server-side languages? The .NET platform has many benefits, not the least of which is its speed, but webmasters the world over continue to push the PHP language to new heights of popularity.

PHP puts the "P" into what has now become popularly known as the LAMP stack (Linux, Apache, MySQL, PHP). With the Apache web server installed on more than half of the world's web sites, there is really no rival for it. The Apache web server is freely downloadable, like all parts of the LAMP stack, PHP among them. PHP is easily compiled as an Apache module and makes a sturdy complement to it.

In comparison, Windows Vista comes with IIS included free for a single web site. Of course, you need to purchase a license for Vista before you can begin. A single-site installation is rarely suitable for companies that have more than a single domain; at a minimum, most will favour registering several domains with different TLDs. This leads to the requirement to purchase Windows Server, the price of which begins at approximately US$600.

Of course, the most common tasks asked of web languages these days involve database communications. Microsoft provides a limited "Express" version of its MS SQL Server for free, but most installations will require the full version, which adds further to the total cost of ownership.

Once again, open source projects bring a bevy of alternatives which are freely downloadable and not limited. Among these, the most popular is Sun Microsystems' MySQL database. MySQL has the benefit of being a truly cross-platform application; with source code freely available and binary releases for Linux, OS X and Windows, it is clearly the superior choice for many when selecting a platform on which to run their business or enterprise.
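
To make this concrete, here is a minimal sketch of the kind of database work discussed above, using the classic MySQL functions that ship with PHP. The host, credentials and the "articles" table are made-up examples, not part of any real site:

<?php
// Minimal sketch of LAMP-style database access with PHP's MySQL extension.
// The host, credentials and "articles" table are hypothetical examples.
$link = mysql_connect('localhost', 'web_user', 'secret');
if (!$link) {
    die('Could not connect: ' . mysql_error());
}
mysql_select_db('example_site', $link);

// Fetch a few rows and print them.
$result = mysql_query('SELECT id, title FROM articles LIMIT 5', $link);
while ($row = mysql_fetch_assoc($result)) {
    echo $row['id'] . ': ' . htmlspecialchars($row['title']) . "\n";
}

mysql_close($link);
?>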

With the above information in mind, it is not too difficult to see that the choice of server-side scripting is often not one of preference or suitability, but one of financial logistics. The total cost of ownership of running .NET on Windows, rather than PHP on the LAMP stack, is considerable.

The most common problem encountered with .NET development is portability. The .NET model supports only Windows servers, making cross-platform integration impossible. Should a better hosting opportunity arise on another platform, or should you be forced to move to a non-Windows platform, .NET will not function, and the site owner is left with a choice of finding another Windows server or re-writing their site/application to support more open standards.

PHP, on the other hand, permits the site or application owner to move their code base between platforms with ease. The genuine cross-platform compatibility of PHP allows developers the freedom to write code that can run on any server supporting PHP, thus avoiding the technological lock-in inherent in Windows and .NET. Using the same code base, PHP can be compiled and built on about 25 platforms, including most UNIX variants, Windows (95/98/NT/2000) and Macs. PHP currently loads into Apache, IIS, AOLserver, Roxen, thttpd and many other web servers; alternatively, it can be run as a CGI program.
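
As a small illustration of the "same code base everywhere" point, the sketch below simply reports which operating system and server API it happens to be running under. The identical file works whether PHP is loaded as an Apache module, running under IIS or invoked as a CGI program:

<?php
// Reports the platform and server API the script is running under.
// The same file runs unchanged on Linux, Windows or Mac OS, whether PHP
// is an Apache module, an IIS module or a CGI program.
echo 'Operating system: ' . PHP_OS . "\n";          // e.g. Linux, WINNT, Darwin
echo 'Server API:       ' . php_sapi_name() . "\n"; // e.g. apache2handler, cgi
echo 'PHP version:      ' . phpversion() . "\n";
?>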

The technology lock-in that comes with .NET also brings with it the increased cost of ownership previously outlined. This is due in part to the closed-source nature of Windows and the .NET system, combined with increased maintenance costs when compared to open source alternatives. PHP has consistently been the most popular Apache scripting language according to SecuritySpace's web survey. This sort of popularity brings with it many more coders, reducing development costs at the outset as well as ongoing maintenance costs when additions are required.

Programming is programming, and many facets of it transfer across languages regardless of platform. System design and security have some universal traditions and taboos. Any sensible system design will make use of as many pre-existing components as possible, reducing development costs and development time. Invariably, with .NET this means even more expenditure, with the cost of each component set by individual developers or by companies whose sole concern is to profit from .NET component development.

In stark contrast to this model, the PHP community and developers have vast repositories of classes and components freely available that cover many of the needs of most web applications.
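
As one hedged example of that reuse model, the sketch below assumes the freely downloadable PHPMailer class has been dropped into the project; the file name, addresses and message are illustrative, and the point is simply that one include replaces a lot of custom mail-handling code:

<?php
// Sketch of reusing a community class (PHPMailer is assumed to have been
// downloaded into the project; addresses and text are illustrative only).
require_once 'class.phpmailer.php';

$mail = new PHPMailer();
$mail->From     = 'webmaster@example.com';
$mail->FromName = 'Webmaster';
$mail->AddAddress('user@example.com');
$mail->Subject  = 'Welcome';
$mail->Body     = 'Thanks for registering.';

if (!$mail->Send()) {
    echo 'Mailer error: ' . $mail->ErrorInfo;
}
?>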

From a security perspective, PHP is much more secure than the .NET platform when considering bugs versus fixes within the core code itself. This has helped attract sites such as facebook.com, yahoo.com, big-boards.com, gaiaonline.com and digg.com, all of which chose PHP as the preferred platform on which to base their enterprise. Couple this with PHP's flexibility and speed to market and you can see why PHP is the choice of the web.

With the sort of market penetration that PHP has gained it is assured that it will be a strong contender for many years to come, whilst the Microsoft camp gets set to introduce yet another language to try to make up lost ground.


Sunday, July 01, 2007

Rapid Prototyping

Rapid prototyping is the automatic construction of physical objects using solid freeform fabrication. The first techniques for rapid prototyping became available in the 1980s and were used to produce models and prototype parts. Today, they are used for a much wider range of applications and are even used to manufacture production quality parts in relatively small numbers. Some sculptors use the technology to produce complex shapes for fine arts exhibitions.

Rapid prototyping takes virtual designs from computer aided design (CAD) or animation modeling software, transforms them into cross sections, still virtual, and then creates each cross section in physical space, one after the next until the model is finished. It is a WYSIWYG process where the virtual model and the physical model correspond almost identically.

In additive fabrication, the machine reads in data from a CAD drawing and lays down successive layers of liquid or powdered material, and in this way builds up the model from a series of cross sections. These layers, which correspond to the virtual cross section from the CAD model, are glued together or fused (often using a laser) automatically to create the final shape. The primary advantage to additive construction is its ability to create almost any geometry (excluding trapped negative volumes).

The standard interface between CAD software and rapid prototyping machines is the STL file format. The word "rapid" is relative: construction of a model with contemporary machines typically takes 3 to 72 hours, depending on machine type and model size. In micro technologies, however, "rapid" is accurate: the parts are produced very quickly and the machines can build many parts in parallel.
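
As a rough, machine-independent illustration of how a model becomes cross sections, the sketch below reads the ASCII variant of an STL file, collects the Z coordinate of every vertex and estimates how many layers an assumed 0.2 mm layer thickness would require ("model.stl" is a placeholder file name):

<?php
// Reads an ASCII STL file, finds the model's height from its vertex lines
// and estimates the number of cross sections for an assumed layer height.
// "model.stl" and the 0.2 mm layer height are illustrative assumptions.
$minZ = null;
$maxZ = null;

foreach (file('model.stl') as $line) {
    $line = trim($line);
    if (strpos($line, 'vertex') === 0) {
        // A vertex line looks like: vertex x y z
        $parts = preg_split('/\s+/', $line);
        $z = (float) $parts[3];
        $minZ = ($minZ === null) ? $z : min($minZ, $z);
        $maxZ = ($maxZ === null) ? $z : max($maxZ, $z);
    }
}

$layerHeight = 0.2; // millimetres
echo 'Model height:  ' . ($maxZ - $minZ) . " mm\n";
echo 'Layers needed: ' . ceil(($maxZ - $minZ) / $layerHeight) . "\n";
?>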

Some solid freeform fabrication techniques use multiple materials in the course of constructing prototypes. In some cases, the material used for the final part has a high melting point for the finished product, while the material used for its support structure has a low melting point. After the model is completed, it is heated to the point where the support material melts away, leaving a functional plastic prototype. Although traditional injection molding is still cheaper for manufacturing plastic products, rapid prototyping may be used to produce finished goods in a single step.

There are currently several projects to improve rapid prototyping technology to the stage where a prototyping machine can manufacture a majority of its own component parts.[1][2] Of these, the RepRap Project is probably the most advanced. The idea is that the owner of an existing machine can manufacture and assemble a new one almost entirely, and quite inexpensively, from the same polymer filament feedstock that the rapid prototyping machine uses to make prototypes. Such a 'self-replication' technique would considerably reduce the cost of prototyping machines in the future, and hence of any objects they are capable of manufacturing.

Saturday, June 23, 2007

Load balancing (computing)

In computer networking, load balancing is a technique (usually performed by load balancers) to spread work between many computers, processes, hard disks or other resources in order to get optimal resource utilization and decrease computing time.

A load balancer can be used to increase the capacity of a server farm beyond that of a single server. It can also allow the service to continue even in the face of downtime due to server failure or maintenance.

A load balancer consists of a virtual server (also referred to as vserver or VIP) which, in turn, consists of an IP address and port. This virtual server is bound to a number of physical services running on the physical servers in a server farm. These physical services contain the physical server's IP address and port. A client sends a request to the virtual server, which in turn selects a physical server in the server farm and directs this request to the selected physical server. Load balancers are sometimes referred to as "directors"; while originally a marketing name chosen by various companies, it also reflects the load balancer's role in managing connections between clients and servers.

Different virtual servers can be configured for different sets of physical services, such as TCP and UDP services in general. Protocol- or application-specific virtual servers that may be supported include HTTP, FTP, SSL, SSL BRIDGE, SSL TCP, NNTP, SIP, and DNS.

The load balancing method in use manages the selection of an appropriate physical server in the server farm. Load balancers also perform monitoring of the services in a web server farm. If a service fails, the load balancer continues to perform load balancing across the remaining services that are UP. If all the servers bound to a virtual server fail, requests may be sent to a backup virtual server (if configured) or optionally redirected to a configured URL, for example a page on a local or remote server that provides information on the site maintenance or outage.
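
The sketch below shows the selection step in a much simplified form: a set of physical services bound to a virtual server, round-robin selection across those marked UP, and a fall-back URL when every bound server is down. The addresses, the round-robin method and the fall-back page are illustrative assumptions, not a description of any particular product:

<?php
// Simplified selection step of a load balancer: round-robin over the
// physical services that are UP, with a fall-back URL when all are down.
// All addresses and names are made up for illustration.
$physicalServers = array(
    array('ip' => '10.0.0.11', 'port' => 80, 'up' => true),
    array('ip' => '10.0.0.12', 'port' => 80, 'up' => false), // failed its health check
    array('ip' => '10.0.0.13', 'port' => 80, 'up' => true),
);
$backupUrl = 'http://maintenance.example.com/outage.html';

function selectServer(array $servers, &$counter)
{
    $candidates = array();
    foreach ($servers as $s) {
        if ($s['up']) {
            $candidates[] = $s; // only consider services that are UP
        }
    }
    if (count($candidates) === 0) {
        return null;            // caller redirects to the backup URL
    }
    $chosen = $candidates[$counter % count($candidates)];
    $counter++;
    return $chosen;
}

$counter = 0;
for ($i = 1; $i <= 4; $i++) {
    $server = selectServer($physicalServers, $counter);
    if ($server === null) {
        echo 'All servers down, redirecting to ' . $backupUrl . "\n";
    } else {
        echo 'Request ' . $i . ' -> ' . $server['ip'] . ':' . $server['port'] . "\n";
    }
}
?>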

Among the server types that may be load balanced are web (HTTP), FTP, DNS and other application servers.

In Global Server Load Balancing (GSLB) (also known as Global Traffic Management) the load balancer distributes load to a geographically distributed set of server farms based on health, server load or proximity.

Saturday, September 30, 2006

Free Antivirus Detection and Removal Tools

Software To Protect You From Viruses, Trojans, Worms and Other Malware

AVG
Here you can get your free copy of the AVG 7.0 Anti-Virus System - AVG 7.0 Free Edition - and you will be able to use it without any limitations for the life of the product.

Avast Home Edition
avast! 4 Home Edition is free antivirus software for home, non-commercial use. It scans for viruses, worms and Trojans on disk, on CDs, and in e-mail, IM and P2P traffic. Incremental updates of the virus database (twice a week) are small, fast and reliable.

AntiVir Personal Edition
AntiVir Personal Edition offers effective protection against computer viruses for individual and private use on a single PC workstation.

Clam Antivirus
ClamWin is free antivirus software for Microsoft Windows NT/98/Me/2000/XP/2003. It provides a graphical user interface to the Clam AntiVirus scanning engine.

ScripTrap
ScripTrap traps scripts when they attempt to run on your computer and gives you the option of blocking them or letting them continue. You can also check the intercepted script with your anti-virus program before deciding whether to run it.

Trend Micro Online Scan
Give your PC a FREE check-up! HouseCall is a demonstration of the power of Web-based technologies that Trend Micro is developing to make deployment and management of virus protection in corporate settings fast and easy.

McAfee Stinger Virus Removal Tool
Stinger is a stand-alone utility used to detect and remove specific viruses. It is not a substitute for full anti-virus protection, but rather a tool to assist administrators and users when dealing with an infected system.

Symantec Virus Removal Tools
Symantec Security Response has developed tools to automatically conduct what would often amount to extensive and tedious manual removal tasks. Check this link for a list of virus removal tools.

BitDefender Virus Removal Tools
SOFTWIN provides a powerful set of virus cleaning tools designed to detect and remove viruses that have infected your system. These applications are also valuable because of their small size, which makes them easy to download even over a slow Internet connection. Check this link for a list of virus removal tools.

Tuesday, August 15, 2006

Top 10 Data Recovery Bloopers

Truth, as the saying goes, is stranger than fiction. The following horror stories are true. The identities of those involved have been omitted, because what happened to them could happen to anyone.
1) It's the Simple Things That Matter
The client, a successful business organization, purchased a "killer" UNIX network system, and put 300+ workers in place to manage it. Backups were done daily. Unfortunately, no one thought to put in place a system to restore the data to.
2) In a Crisis, People Do Silly Things
The prime server in a large urban hospital's system crashed. When minor errors started occurring, system operators, instead of gathering data about the errors, tried anything and everything, including repeatedly invoking a controller function which erased the entire RAID array data.
3) When the Crisis Deepens, People Do Sillier Things
When the office of a civil engineering firm was devastated by floods, its owners sent 17 soaked disks from three RAID arrays to a data recovery lab in plastic bags. For some reason, someone had frozen the bags before shipping them. As the disks thawed, even more damage was done.
4) Buy Cheap, Pay Dearly
The organization bought an IBM system - but not from IBM. Then the system manager decided to configure the system uniquely, rather than following set procedures. When things went wrong with the system, it was next to impossible to recreate the configuration.
5) An Almost Perfect Plan
The company purchased and configured a high-end, expensive, and full-featured library for the company's system backups. Unfortunately, the backup library was placed right beside the primary system. When the primary system got fried, so too did the backup library.
6) The Truth, and Nothing But the Truth
After a data loss crisis, the company CEO and the IT staffer met with the data recovery team. No progress was made until the CEO was persuaded to leave the room. Then the IT staffer opened up, and solutions were developed.
7) Lights Are On, But No One's Home
A region-wide ambulance monitoring system suffered a serious disk failure, only for its operators to discover that the automated backup hadn't run for fourteen months. A tape had jammed in the drive, but no one had noticed.
8) When Worlds Collide
The company's high-level IT executives purchased a "Cadillac" system, without knowing much about it. System implementation was left to a young and inexperienced IT team. When the crisis came, neither group could talk to the other about the system.
9) Hit Restore and All Will Be Well
After September's WTC attacks, the company's IT staff went across town to their backup system. They invoked Restore and proceeded to overwrite the backups with data from the destroyed main system. Of course, all previous backups were lost.
10) People Are the Problem, Not Technology
Disk drives today are typically reliable - human beings aren't. A recent study found that approximately 15 percent of all unplanned downtime occurs because of human error.

Thursday, June 29, 2006

AJAX - Executive Summary

AJAX (Asynchronous JavaScript and XML) is an approach to building and deploying Rich Internet Applications (RIAs). Other RIA technologies include Flash, Java Applets and ActiveX Controls.

Without these kinds of applications, the clients are thin and monolithic, as in a legacy architecture. RIAs put the smile back on the client machines by maximizing their CPU utilization. You have had the scotch and the bourbon. Be ready to taste a cocktail now. Enter AJAX!

While AJAX represents a culmination of well-known technologies like JavaScript, the DOM, HTML and XML, it is not a language by itself. So what's all the hype about? Let me explain:

The biggest problem that we face in web applications is the time taken to reload web pages. This, depending on what you have put on the page and the speed of your connection, can range from average to slow. Now think of an application where the page is refreshed in an instant, or, if the page has to come from the server, it does so in the background. No more hourglasses, so to speak. A big headache goes away.

Now imagine being able to support IE, Firefox, Safari and Opera (7.0 and above) without worrying about the quirks of each of them. Think of the page being refreshed intelligently depending on the type of your Internet connection (dial-up, broadband, etc.), of moving away from page-based design to an event-driven interface, and of an intelligent user interface that minimizes the number of clicks for a rich user experience. And all this without having to procure a plug-in or an applet, and based on open standards.

Sounds like a dream. Please welcome AJAX! Again.

A small detour: as per Greek mythology, Ajax was the son of Telamon, the king of Salamis, and a mighty hero of the Trojan War.

When Netscape introduced the layer element in 1998, Microsoft went ahead with its own DOM (Document Object Model), which relied on a new JavaScript collection called document.all. As Netscape lost the browser war to Microsoft (in early 2000), it was unable to push the layer idea further or make it appeal to the DHTML crowd, although something similar is available in a rudimentary form through 'DIV' tags.

An Ajax application does away with the start-and-stop, repetitive interaction between a client and a server. Instead of loading a web page at the start of a session, the browser loads an Ajax engine written in JavaScript, safely tucked away in a hidden frame. Ajax applications manipulate the DOM to control the UI while keeping backend requests as minimal as possible. One can manipulate web forms, run search queries, modify a record and so on, and the user's interaction with the application happens asynchronously, independent of server communication. The core of AJAX lies in a browser-based object named XMLHttpRequest, which allows the browser to make calls back to the server without reloading the current page. This functionality enables features like drag and drop, in-place editing and customized user views.
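
As a rough sketch of the server side of such a call, the script below (suggest.php, a name invented for illustration) returns a small fragment of data that the browser's XMLHttpRequest object can fetch in the background, for example as suggest.php?q=ph, and insert into the page through the DOM, so the visible page never reloads:

<?php
// suggest.php - hypothetical server side of an Ajax call. The browser's
// XMLHttpRequest object requests this URL in the background and inserts
// the returned text into the page via the DOM, without a page reload.
$terms = array('php', 'phpmyadmin', 'photoshop', 'python');
$query = isset($_GET['q']) ? strtolower($_GET['q']) : '';

header('Content-Type: text/plain');
foreach ($terms as $term) {
    if ($query !== '' && strpos($term, $query) === 0) {
        echo $term . "\n"; // one suggestion per line for the client to render
    }
}
?>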

Now, AJAX is not as 'rosy' as it seems. There are worries that AJAX may interfere with the browser's Back button. The JavaScript has to be written and included by hand, and the XMLHttpRequest object is created through ActiveX in IE 6.0 and below. Developers using AJAX also have to deal with network latency in code; if this is not done well, there can be delays.

There are quite a number of AJAX frameworks and IDEs, like Backbase, JackBe and others, and some of these tools offer a visual workflow. Google uses AJAX in Google Suggest and Google Maps. Others have also started to get a feel for this powerful paradigm.

By 2010, it is expected that 60% of all applications on the web will be RIAs. Ajax will definitely be one of them, if not the leader. Others, like XUL and XAML (part of Microsoft's Avalon), are also contenders.