printf("goodbye, Dennis");

G.F. | 2011-10-22

Dennis Ritchie, a father of modern computing, died on October 8th, aged 70

EVERY time you tap an iSomething, you are touching a little piece of Steve Jobs. His singular vision shaped the products Apple has conjured up, especially over the last 14 years, after Jobs returned to the helm of the company he had founded. Jobs's death in October resembled the passing of a major religious figure. But all of his technological miracles, along with a billion others sold by Apple's competitors, would be merely pretty receptacles were it not for Dennis Ritchie. It is to him that they owe their digital souls, the operating systems and programs which make them tick.

In the early 1970s Mr Ritchie invented the C programming language. It fundamentally changed how software is created. Its popularity stemmed from a mix of robustness and efficiency. Crucially, it was thin: in geek-speak, it used little computing power at a time when that was in short supply, while letting programmers control the hardware directly with little effort. It was also portable. A C program written for one computer could be modified to work on another. (This is not always easy, but it is possible.)
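
A minimal sketch (not from the article) of what that "thinness" looks like in practice: C lets a programmer march through raw memory with pointers, and each line maps onto a handful of machine instructions.

#include <stdio.h>

/* Copy a string byte by byte by walking two pointers through memory.
   This is close to how the standard library's strcpy is written. */
void copy(char *dst, const char *src)
{
    while ((*dst++ = *src++) != '\0')
        ;  /* empty loop body; the test does all the work */
}

int main(void)
{
    char buf[32];
    copy(buf, "hello, world");
    printf("%s\n", buf);
    return 0;
}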

If that were not world-changing enough, Mr Ritchie was also instrumental (with Ken Thompson and others) in developing Unix, an operating system project begun in 1969 that was originally intended to be a simpler way to run bulky mainframes. At first Unix found a home in academic institutions (Babbage used his first Sun Unix workstation at university in the 1980s) and government agencies. Then, in the 1990s, came the explosion of the internet. In 1991 Linus Torvalds, a Finnish software engineer, reinvented Unix for the internet age. The result was Linux, which worked in much the same way as its forebear and could run the same free and open-source software, but contained none of the original's proprietary code, and so none of the intellectual-property restrictions that came with it.

Most modern software is written in C's more evolved descendants. These include C++, Objective-C (which Apple favours) and C# (which rival Microsoft favours). Another staple of the digital age, Java, also owes a substantial debt to Mr Ritchie's invention. Meanwhile, Unix-like systems power several hundred million Apple and Android mobile devices, most internet firms' server farms and a billion tiny gadgets, such as digital video recorders and music players. There are alternatives, of course (Microsoft Windows, Nokia's Symbian and Qualcomm's BREW, among others), but their reach pales in comparison.

Mr Ritchie was modest, and deeply committed to his work, which he pursued with unflagging passion until his retirement a few years ago. His personal web page at Bell Labs, unmodified since 2006 except for the addition of a note from his siblings regarding his passing, shows a quirky character, as likely to post information about his sundry namesakes as to offer insight into his work.

His popular writings were as spare, efficient—and influential—as his coding. "The C Programming Language", a textbook he wrote with Brian Kernighan, has remained the authoritative source about all things C for over 30 years. The book introduced the first program a C coder learns:

main()
{
    printf("hello, world");
}

which gets a computer to display the words "hello, world". (Mr Kernighan had come up with both the phrase and the task in an earlier internal manual at AT&T.) Patiently taking the reader through the rudiments of a language, with progressively harder programming tasks, was a departure from the dry, ultra-technical manuals of the day. The tens of thousands of computer books that followed all bear Mr Ritchie's mark.
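
(For reference, and not as the book printed it: under today's C standards the same program wants an explicit header, return type and trailing newline.)

#include <stdio.h>  /* declares printf */

int main(void)
{
    printf("hello, world\n");  /* \n ends the output line */
    return 0;
}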

Mr Ritchie and Jobs crossed paths at a crucial juncture. When Jobs was ousted from Apple in 1985 and founded NeXT, he did not create an operating system from scratch; his machines ran a version of Unix. On his triumphant return to Apple, after the company acquired NeXT in 1996, Jobs abandoned the company's ongoing effort to modernise Mac OS. He chose a version of Unix instead (and added an "X" to Mac OS), and all Macs since have relied on it. So does the iOS operating system, which breathes life into iPhones and iPads. Yet for all of Mr Ritchie's groundbreaking contributions, and his key role in making Apple's gadgets what they are, his passing received precious little attention from the world's media, still preoccupied with that of the computer industry's most consummate showman.

All operating systems know when they were born: their internal clocks start counting from that moment, which lets them calculate any subsequent date and time. It is unclear whether it was Mr Ritchie or Mr Thompson who set the start of Unix time at January 1st 1970. That moment came to be known as the epoch. Mr Ritchie helped bring it about. And with it, he ushered in a new era.
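
(A small illustration, not part of the original article: in the language Mr Ritchie created, the standard library reports the current moment as a count of seconds since that epoch.)

#include <stdio.h>
#include <time.h>

int main(void)
{
    /* time() returns the number of seconds elapsed since
       00:00:00 UTC on January 1st, 1970: the Unix epoch. */
    time_t now = time(NULL);
    printf("%lld seconds since the epoch\n", (long long)now);
    printf("which is %s", ctime(&now));  /* ctime's string ends in a newline */
    return 0;
}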

Source: http://www.economist.com/blogs/babbage/2011/10/obituary-0?fsrc=nlw|newe|10-21-2011|new_on_the_economist
