MS-DOS History

In 1980, when IBM was building its new personal computer, code-named "Acorn", it needed an operating system and a computer language.
At the time, one of the most successful computer languages for microcomputers was Altair BASIC, written by Microsoft, and the dominant operating system was CP/M, a product of Gary Kildall's Digital Research.
IBM approached Bill Gates to write BASIC for the new computer, and Gary Kildall for the operating system.
Microsoft was only too happy to comply. They had already created a version of BASIC for the 8086, which Seattle Computer Products, a computer hardware company, had demonstrated at the 1979 National Computer Conference.
Legend has it that when IBM first approached Digital Research, Kildall was out of the office flying one of his planes and never met them. This was not strictly true: although he was out flying, he showed up for the meeting, a little late, and continued the discussion with the IBM representatives on a flight back to their office in Florida. They never reached a deal; Kildall wanted more than the $200,000 IBM was willing to pay for a royalty-free license in perpetuity.
So IBM went back to Bill Gates and asked whether Microsoft could do the operating system as well.
At the time, Tim Paterson of Seattle Computer Products had written an operating system called QDOS, an acronym for "Quick and Dirty Operating System," for the 16-bit Intel 8086 (according to Byte Magazine, it was "thrown together in two man-months"). QDOS was in every important respect a clone of CP/M, rewritten for the 8086.
Microsoft purchased the rights to QDOS from SCP for $50,000 and tweaked it so that it could run on the 8088. (The 8088 chip was a variant of the 8086 and could run the same software with minor alterations.)
Gates sold DOS to IBM for $50,000 and persuaded them that he should retain the rights to license the system to other computer manufacturers.
The rest, as they say, is history. The 'other computer manufacturers' made millions of PC clones, and had to buy a copy of MS-DOS from Microsoft for every PC they sold.
Kildall was furious. He claimed that DOS was a copy of all the best features of CP/M, but unfortunately software copyright law was not mature at the time, and there was precious little he could do about it.
***
In 1981, Tim Paterson quit Seattle Computer Products and found employment at Microsoft.
In 1994, Gary Kildall, by now an embittered man struggling with alcohol, died from head injuries sustained at a Monterey bar. An inquest called the death "suspicious," but no one was charged.
An Inside Look at MS-DOS
The Dross of the DOS
The Man Who Could Have Been Bill Gates
***

Response to LIB$DAY error report

They don't answer user complaints like they used to. The following is a response dated 13-Oct-1983 by Stanley Rabinowitz at DEC.
----------------------------------------------------------
SPR PROBLEM ABSTRACT:
User claims year 2000 should not be a leap year.

SPR ANSWER FORM
SPR NO. 11-60903
SYSTEM VERSION PRODUCT VERSION COMPONENT
SOFTWARE: VAX/VMS V3.2 VAX/VMS V3.2 Run-Time Library

PROBLEM:
The LIB$DAY Run-Time Library service "incorrectly" assumes the year 2000 is a leap year.

RESPONSE:
Thank you for your forward-looking SPR.
Various system services, such as SYS$ASCTIM, assume that the year 2000 will be a leap year. Although one can never be sure of what will happen at some future time, there is strong historical precedent for presuming that the present Gregorian calendar will still be in effect by the year 2000. Since we also hope that VMS will still be around by then, we have chosen to adhere to these precedents.
The purpose of a calendar is to reckon time in advance, to show how many days have to elapse until a certain event takes place in the future, such as the harvest or the release of VMS V4. The earliest calendars, naturally, were crude and tended to be based upon the seasons or the lunar cycle.
The calendar of the Assyrians, for example, was based upon the phases of the moon. They knew that a lunation (the time from one full moon to the next) was 29 1/2 days long, so their lunar year had a duration of 354 days. This fell short of the solar year by about 11 days. (The exact time for the solar year is approximately 365 days, 5 hours, 48 minutes, and 46 seconds.) After 3 years, such a lunar calendar would be off by a whole month, so the Assyrians added an extra month from time to time to keep their calendar in synchronization with the seasons.
The best approximation that was possible in antiquity was a 19-year period, with 7 of these 19 years having 13 months (leap months). This scheme was adopted as the basis for the religious calendar used by the Jews. (The Arabs also used this calendar until Mohammed forbade shifting from 12 months to 13 months.)
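The arithmetic behind the 19-year cycle can be checked directly; a quick sketch (the astronomical constants below are approximate modern values, not part of the original SPR):

```python
# 19 solar years very nearly equals 235 lunations (12*19 regular months + 7 leap months).
SOLAR_YEAR = 365.2422   # mean solar year in days (approx.)
LUNATION = 29.53059     # mean time from one full moon to the next, in days (approx.)

solar_days = 19 * SOLAR_YEAR            # about 6939.60 days
lunar_days = (12 * 19 + 7) * LUNATION   # 235 lunations, about 6939.69 days

# The two cycles differ by well under a tenth of a day:
print(f"difference over 19 years: {lunar_days - solar_days:.2f} days")  # 0.09
```

The scheme works because 235 lunations and 19 solar years agree to within about two hours, so leap months inserted 7 times per 19 years keep a lunar calendar aligned with the seasons.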
When Rome emerged as a world power, the difficulties of making a calendar were well known, but the Romans complicated their lives because of their superstition that even numbers were unlucky. Hence their months were 29 or 31 days long, with the exception of February, which had 28 days. Every second year, the Roman calendar included an extra month called Mercedonius of 22 or 23 days to keep up with the solar year.
Even this algorithm was very poor, so that in 45 BC, Caesar, advised by the astronomer Sosigenes, ordered a sweeping reform. By imperial decree, one year was made 445 days long to bring the calendar back in step with the seasons. The new calendar, similar to the one we now use, was called the Julian calendar (named after Julius Caesar). Its months were 30 or 31 days in length, and every fourth year was made a leap year (having 366 days). Caesar also decreed that the year would start with the first of January, not the vernal equinox in late March.
Caesar's year was 11 1/2 minutes short of the calculations recommended by Sosigenes and eventually the date of the vernal equinox began to drift. Roger Bacon became alarmed and sent a note to Pope Clement IV, who apparently was not impressed. Pope Sixtus IV later became convinced that another reform was needed and called the German astronomer, Regiomontanus, to Rome to advise him. Unfortunately, Regiomontanus died of the plague shortly thereafter and the plans died as well.
In 1545, the Council of Trent authorized Pope Gregory XIII to reform the calendar once more. Most of the mathematical work was done by Father Christopher Clavius, S.J. The immediate correction that was adopted was that Thursday, October 4, 1582 was to be the last day of the Julian calendar. The next day was Friday, with the date of October 15. For long range accuracy, a formula suggested by the Vatican librarian Aloysius Giglio was adopted. It said that every fourth year is a leap year except for century years that are not divisible by 400.
Thus 1700, 1800 and 1900 would not be leap years, but 2000 would be a leap year since 2000 is divisible by 400. This rule eliminates 3 leap years every 4 centuries, making the calendar sufficiently correct for most ordinary purposes. This calendar is known as the Gregorian calendar and is the one that we now use today. (It is interesting to note that in 1582, all the Protestant princes ignored the papal decree and so many countries continued to use the Julian calendar until either 1698 or 1752. In Russia, it needed the revolution to introduce the Gregorian calendar in 1918.)
This explains why VMS chooses to treat the year 2000 as a leap year.
Despite the great accuracy of the Gregorian calendar, it still falls behind very slightly every few years. If you are very concerned about this problem, we suggest that you tune in short wave radio station WWV, which broadcasts official time signals for use in the United States. About once every 3 years, they declare a leap second at which time you should be careful to adjust your system clock. If you have trouble picking up their signals, we suggest you purchase an atomic clock (not manufactured by Digital and not a VAX option at this time).

END OF SPR RESPONSE
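The Gregorian leap-year rule quoted in the SPR response can be sketched as a short Python function (an illustration added here, not part of the original response):

```python
def is_leap(year: int) -> bool:
    """Gregorian rule: every fourth year is a leap year,
    except century years that are not divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# 1700, 1800, and 1900 are not leap years, but 2000 is:
print([y for y in (1700, 1800, 1900, 2000) if is_leap(y)])  # [2000]
```

This is the same rule implemented by the standard library's calendar.isleap, and it is why LIB$DAY was right to treat the year 2000 as a leap year.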

History of SQL

In June 1970, Dr. Edgar F. Codd published a seminal paper, "A Relational Model of Data for Large Shared Data Banks". Codd's model became widely accepted as the definitive model for relational database management systems (RDBMS). After Codd published this paper, two projects were started to test its viability: INGRES at UC Berkeley in 1970, and later System R at IBM's San Jose research center in 1974-75. INGRES (INteractive Graphics REtrieval System) used QUEL (QUEry Language) as its query language, and System R used SEQUEL.
The term SEQUEL was originally coined as a pun on QUEL (since it came after QUEL, it was named "sequel"). In 1977, a revised version, SEQUEL/2, was defined. This was later renamed SQL due to a trademark dispute (the word "SEQUEL" was held as a trademark by the Hawker Siddeley aircraft company of the UK). Although these query languages were greatly influenced by Codd's work, they were not designed by Codd himself: the QUEL language design was due to Michael Stonebraker at UC Berkeley, and the SEQUEL language design was due to Donald Chamberlin and Raymond Boyce at IBM. IBM published its concepts to increase interest in SEQUEL (later SQL).

Milestones in RDBMS development
1970 Dr. E. F. Codd publishes his first paper on the relational model
UC Berkeley INGRES prototype work begins
1974 IBM SEQUEL language and prototype developed
IBM System R Prototype work begins
1977 Relational Software Inc. (RSI) founded
Revised SEQUEL/2 (subsequently renamed SQL) defined
1979 Teradata Corporation formed
Britton-Lee, Inc. (later renamed ShareBase) formed
Oracle released by RSI (now Oracle Corporation)
1981 SQL/DS for VSE announced by IBM
INGRES for VAX/VMS announced by Relational Technology, Inc. (RTI)
1983 DB2 for MVS announced by IBM
1984 First DBC/1012 database machine shipped by Teradata
1985 Teradata acquired Britton-Lee
1986 First version of SQL standard released
Sybase Inc. formed
1987 NonStop SQL announced by Tandem
1988 Microsoft, Sybase, and Ashton-Tate port SQL Server to OS/2
1992 AT&T purchases NCR and Teradata
1993 Microsoft and Sybase end partnership
Microsoft releases SQL Server for Windows NT
1995 Computer Associates acquires INGRES as a part of its Ask Group purchase
1997 NCR becomes independent company
1998 In-database OLAP and data mining appear in RDBMSs
2000 RDBMSs continue to add OO capabilities and support for complex data
2001 Native XML support is provided for the first time in an RDBMS
2003 W3C enhances XQuery, the XML query language
2004 SQL:2003 standard is published

Out of Africa and into India?

A recent study (February 2006) published in The American Journal of Human Genetics suggests that the origins of Indians are largely indigenous, dating back to the Palaeolithic period. According to the study, the genetic influence of the Indo-European-speakers who established the caste system was small.
The authors conclude that while some lineages did move in from outside, many of the major ones are likely to have arisen within India. Specifically, five major haplogroups (C5-M356, F*-M89, H-M69, L1-M76, and R2-M124) were likely to have originated within the subcontinent. The origin of one common haplogroup (R1a1-M17) could not be determined.
(- Polarity and Temporality of High-Resolution Y-Chromosome Distributions in India Identify Both Indigenous and Exogenous Expansions and Reveal Minor Genetic Influence of Central Asian Pastoralists, S. Sengupta, et al.)

Modern humans can be traced back to a group of people who lived in Africa around 100,000 years ago. We know this by analyzing genetic markers in the Y-chromosome and mtDNA (mitochondrial DNA) structure of contemporary populations and projecting them back in time. People with similar sets of genetic markers are grouped into haplogroups and links are established between the members of a haplogroup and the marker's first appearance in the group's most recent common ancestor (MRCA).
Determining the origin of a genetic lineage and its subsequent spread is a science fraught with uncertainty. In the simplest analysis, current high frequency and high diversity may mark the origin of a lineage. But there are other ways these can arise: high frequency through genetic drift, and high diversity through admixture. And of course there is the fact that people move. There is high frequency and high diversity of genetic markers in Australia, yet there is no "origination in Australia" theory.
Dates are generally calculated by imputing mutation rates; since there are a variety of different techniques, molecular dates tend to be less certain than archaeological ones.
Notwithstanding the dangers, there is growing genetic evidence that the subcontinent of India has been a major corridor for the migration of people between Africa and the rest of the world.

Studies of maternally inherited mitochondrial DNA are revealing the excursion choices of our earliest ancestors. In their Perspective, Forster and Matsumura discuss two new studies of the mitochondrial DNA of the indigenous peoples of Malaysia and the Andaman islands (Macaulay et al., Thangaraj et al.). These studies suggest that the earliest humans took a southern route along the coastline of the Indian Ocean before fanning out over the rest of the world. - Forster P and Matsumura S. 2005. Did early humans go north or south? Science 308:965-966. Science Online