New Technologies and What They Mean To You: Part 3 - CGI (Page 3 of 4 )
CGI stands for Common Gateway Interface, a standard for running external programs from a Web server. CGI specifies how arguments from the HTTP request are passed to the executing program, and it defines a set of environment variables the server makes available. Typically each program dynamically generates HTML, which is passed back to the browser, but a CGI program can also perform a variety of server-side tasks such as redirects, reading and writing text files, etc.
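To make the mechanics concrete, here is a minimal sketch of a CGI program. The article's examples center on Perl, but the same idea applies in any language; this sketch uses Python, and the output format (a header block, a blank line, then the HTML body, all written to standard output) is what the CGI standard requires:

```python
#!/usr/bin/env python3
# Minimal CGI program: the Web server sets environment variables such as
# REQUEST_METHOD and QUERY_STRING before running this script, then relays
# whatever the script writes to standard output back to the browser.
import os

method = os.environ.get("REQUEST_METHOD", "GET")
query = os.environ.get("QUERY_STRING", "")

# A CGI response is a header block, a blank line, then the document body.
response = (
    "Content-Type: text/html\r\n"
    "\r\n"
    f"<html><body><p>Method: {method}, query: {query}</p></body></html>"
)
print(response)
```

Dropped into a server's cgi-bin directory (and made executable), a script like this is all it takes for the server to serve dynamically generated HTML.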
To improve performance, Netscape devised NSAPI and Microsoft developed ISAPI. These APIs allow CGI-like tasks to run as part of the main server process, thus avoiding the overhead of creating a new process to handle each CGI invocation.
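The per-request overhead that NSAPI and ISAPI avoid is easiest to see from the server's side. The sketch below is illustrative, not a real server: it shows a server spawning a fresh child process for one CGI request, passing request details through environment variables and capturing the child's standard output as the response (the variable names and query string are made up for the example):

```python
# Sketch of what a Web server does per CGI request: spawn a brand-new
# process, hand it the request via environment variables, and capture its
# stdout as the HTTP response body. One process per request is exactly the
# cost that in-process APIs like NSAPI/ISAPI eliminate.
import os
import subprocess
import sys

# Hypothetical CGI variables for one incoming request.
cgi_env = {
    "REQUEST_METHOD": "GET",
    "QUERY_STRING": "name=reader",
    "SERVER_PROTOCOL": "HTTP/1.1",
}

# Stand-in for an external CGI program (normally a script on disk).
cgi_program = (
    "import os; "
    "print('Content-Type: text/plain'); print(); "
    "print('hello ' + os.environ['QUERY_STRING'])"
)

result = subprocess.run(
    [sys.executable, "-c", cgi_program],
    env={**os.environ, **cgi_env},  # server env plus the CGI variables
    capture_output=True,
    text=True,
)
response = result.stdout
```

Every request repeats the full process-creation cost, which is cheap for a handful of hits but adds up quickly on a busy site.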
What It Really Is
CGI is, more than anything, a framework. It isn’t a language at all: CGI programs are written in ordinary programming languages and invoked by the Web server through the CGI interface. The most common language in use is Perl (others include C/C++, Fortran, Tcl, Visual Basic, and AppleScript), which we’ll cover later in this article.
CGI was developed for Unix Web servers in the early 1990s and is still only on version 1.1. Unlike much other software, this is in fact a good thing: it was designed with a specific goal in mind, and that goal hasn’t changed.
CGI is in fact extremely flexible in that programs can be either compiled (when written in languages like C++) or interpreted (when written in languages like Perl and Tcl).
More often than not, CGI scripts and applications are written in Perl. In fact, Perl is so commonly used for writing CGI scripts that most people think the two are one and the same.
Pros:
- Can be written in a multitude of languages
- Fast and efficient
- Powerful and robust
- Compiled or interpreted, as appropriate

Cons:
- Steeper learning curve than newer technologies
- Fewer and fewer resources as it becomes more “old school”