That's a major milestone for any open source software project, and I'm proud of the work that we've done on it over the past quarter century.
I'm also proud of how we built FreeDOS, because it is a great example of how the open source software model works. MS-DOS provided a flexible command line, which I quite liked and which came in handy for manipulating my files. Over the years, I learned how to write my own utilities in C to expand its command-line capabilities even further. Then in 1994, Microsoft announced that the next version of Windows would do away with DOS. But I liked DOS. I figured that if we wanted to keep DOS, we would need to write our own.
And that's how FreeDOS was born. On June 29, 1994, I made a small announcement about my idea to the comp.os.msdos.apps newsgroup. The general support for this at the time was strong, and many people agreed with the statement: "start writing!" I have written up a "manifest" describing the goals of such a project and an outline of the work, as well as a "task list" that shows exactly what needs to be written.
I'll post those here, and let discussion follow. I started working on it right away. I contributed over a dozen FreeDOS utilities. By sharing my utilities, I gave other developers a starting point. Other developers who saw FreeDOS taking shape contacted me and wanted to help. Others contributed utilities that replicated or expanded the DOS command line. We released our first alpha version as soon as possible. Less than three months after announcing FreeDOS, we had an Alpha 1 distribution that collected our utilities.
New developers joined the project, and we welcomed them. You may be familiar with other milestones. We crept our way towards the 1.0 release. MS-DOS stopped being a moving target long ago, so we didn't need to update as frequently after 1.0. Today, FreeDOS supports networking and even provides a simple graphical web browser, Dillo. And we have tons of new utilities, including many that will make Linux users feel at home.
FreeDOS got where it is because developers worked together to create something. In the spirit of open source software, we contributed to each other's work by fixing bugs and adding new features. We treated our users as co-developers; we always found ways to include people, whether they were writing code or writing documentation. And we made decisions through consensus based on merit. If that sounds familiar, it's because those are the core values of open source software: transparency, collaboration, release early and often, meritocracy, and community.
That's the open source way!

Microsoft co-founder Paul Allen died today from complications of non-Hodgkin's lymphoma. He was 65. Allen had said earlier this month that he was being treated for the disease.
Allen was a childhood friend of Bill Gates, and together the two started Microsoft in 1975. He left the company in 1983 while being treated for Hodgkin's lymphoma and remained a board member with the company through 2000. He was first treated for non-Hodgkin's lymphoma in 2009, before seeing it go into remission.
In a statement given to ABC News, Gates said he was "heartbroken by the passing of one of my oldest and dearest friends."
From our early days together at Lakeside School, through our partnership in the creation of Microsoft, to some of our joint philanthropic projects over the years, Paul was a true partner and dear friend. Personal computing would not have existed without him.
But Paul wasn't content with starting one company. He channelled his intellect and compassion into a second act focused on improving people's lives and strengthening communities in Seattle and around the world.
He was fond of saying, "If it has the potential to do good, then we should do it." Paul loved life and those around him, and we all cherished him in return. He deserved much more time, but his contributions to the world of technology and philanthropy will live on for generations to come.
I will miss him tremendously.

Microsoft CEO Satya Nadella also paid tribute: Paul Allen's contributions to our company, our industry, and to our community are indispensable. As co-founder of Microsoft, in his own quiet and persistent way, he created magical products, experiences and institutions, and in doing so, he changed the world.
I have learned so much from him -- his inquisitiveness, curiosity, and push for high standards is something that will continue to inspire me and all of us at Microsoft. Our hearts are with Paul's family and loved ones. Rest in peace.

In a memoir published in 2011, Allen said that he was responsible for naming Microsoft and creating the two-button mouse.
The book also portrayed Allen as going under-credited for his work at Microsoft, and Gates as having taken more ownership of the company than he deserved. It created some drama when it arrived, but the two men ultimately appeared to remain friends, posing for a photo together two years later.
After leaving Microsoft, Allen became an investor through his company Vulcan, buying into a diverse set of companies and markets. Vulcan's current portfolio ranges from the Museum of Pop Culture in Seattle, to a group focused on using machine learning for climate preservation, to Stratolaunch, which is creating a spaceplane.
Allen's investments and donations made him a major name in Seattle, where much of his work was focused. NFL Commissioner Roger Goodell said Allen "worked tirelessly" to "identify new ways to make the game safer and protect our players from unnecessary risk." He also launched a number of philanthropic efforts, which were later combined under the name Paul G. Allen Philanthropies.

His sister Jody Allen said in a statement: "My brother was a remarkable individual on every level. While most knew Paul Allen as a technologist and philanthropist, for us he was a much loved brother and uncle, and an exceptional friend. Paul's family and friends were blessed to experience his wit, warmth, his generosity and deep concern. For all the demands on his schedule, there was always time for family and friends. At this time of loss and grief for us, and so many others, we are profoundly grateful for the care and concern he demonstrated every day."

One recent project, the Allen Brain Observatory, provides an open-access "catalogue of activity in the mouse's brain," Saskia de Vries, senior scientist on the project, said in a video.
That kind of data is key to piecing together how the brain processes information. In an interview with Matthew Herper at Forbes, Allen called the brain "hideously complex" -- much more so than a computer. He also created the Paul G. Allen Frontiers Group in 2016, which funds cutting-edge research. Even years ago, when Allen spoke with Herper at Forbes, he talked about plans for his financial legacy after his death, and he said that a large part of it would be "allocated to this kind of work for the future."

Paul's vision and insight have been an inspiration to me and to many others, both here at the Institute that bears his name, and in the myriad of other areas that made up the fantastic universe of his interests.
He will be sorely missed. We honor his legacy today, and every day into the long future of the Allen Institute, by carrying out our mission of tackling the hard problems in bioscience and making a significant difference in our respective fields.
According to Quincy Jones, Allen was also an excellent guitar player.

Man, what a shock! He liked to swing by the office on a regular basis, as we were just a few blocks from Dick's hamburgers on Mercer St, his favorite. He was really an engineer's engineer. We'd give him a status report on how things were going, and within a few minutes he was up at the whiteboard spitballing technical solutions to ASIC or network problems. I especially remember him coming by the day he bought the Seahawks.
Paul was a big physical presence (6'2" in those days), but he kept going on about how, after meeting the Seahawks players, he had never felt so physically small in his life. Ignore the internet trolls. Paul was a good guy. He was a humble, modest, down-to-earth guy. There was always a pick-up basketball game on his court on Thursday nights. Jam sessions over at his place were legendary. I never got to play with him, but every musician I know who played with him was impressed with his guitar playing.
He left a huge legacy in the Pacific Northwest. We'll miss you, Paul!

The book Paul Allen wrote avoids a full account, but it gives the impression that Bill Gates was so angry that Paul Allen left the company; interacting with Bill Gates was bad for his health.
Quotes from the book, Idea Man [amazon.com]. His reply: "It was neat that they got along well enough that the company didn't explode in the first year or two."
When Bill pushed on licensing terms or bad-mouthed the flaky Signetics cards, Ed thought he was insubordinate. You could hear them yelling throughout the plant, and it was quite a spectacle: the burly ex-military officer standing toe to toe with the owlish prodigy about half his weight, neither giving an inch.
At product review meetings, his scathing critiques became a perverse badge of honor. One game was to count how many times Bill confronted a given manager; whoever got tagged for the most "stupidest things" won the contest.
He used to have the nickname "Doctor NetVorkian," because many of the things he invested in promptly tanked in one way or another after his investment. He had a lot of bad luck with his investments. For those who don't understand the joke, a certain Dr. Kevorkian became notorious for helping ill patients commit suicide.

But you can wipe Windows off your hard drive, so I don't get your point. Paul Allen was a great guy in many, many ways.
Even if you could "blame" him for all or part of Windows, he did start the Museum of Pop Culture [wikipedia.org]. If you are ever in Seattle, it is a must-see. I mean, they have what is probably the best Star Trek museum display anywhere (which is saying a lot, since the Smithsonian has a very nice one as well), including most of the original series set pieces and, I believe, one of the only actual Enterprise models used for filming.
In my mind, that gives him a great deal of geek cred. Plus, as I under…

I knew someone would say that. You are right. I won't. But he won't either. He was a patent troll. Oh, but: RIP and thoughts and prayers, right?

He was a great guy and will be missed.

Since 1994, Jim Hall has been overseeing an open source project that maintains a replacement for the MS-DOS operating system, and he has just announced the release of the "updated, more modern" FreeDOS 1.2.
You can find more tools and games, and a few graphical desktop options including OpenGEM. But the first thing you'll probably notice is the all-new installer that makes it much easier to install FreeDOS.

All of these have ibmbio.com and ibmdos.com, and come with a collection of PC DOS files and a few others (among them E.COM, FC.COM, and various .EXE and .SYS files). There are other task-specific utilities, some of which seem to have general application. These files write new boot sectors. You can use bootpart or my modified ibmpart to move the necessary files into place.
I installed all of these boot sectors onto a VM, and then reverted to PC DOS. The VM rebooted perfectly. It tells you how to change the version to any PC DOS or MS-DOS number, even things like PC DOS 5. Still, it's handy, since you can test MS-DOS 6 and the rest. I made versions for all the PC DOS and MS-DOS releases, and there's room to handle other DOS versions as well. Beats setver.
Its latest publicly available build is from December, new enough in my humble opinion. Source is free for OpenDOS 7.01. If they could, Udo Kuhnt would be in a lot of trouble. Snipped from the OpenDOS 7.01 license: "The evaluation period for use by or on behalf of a commercial entity is limited to 90 days; evaluation use by others is not subject to this 90 day limit but is still limited to a reasonable period."

E Editor - Text editor with menus, math, and mouse support.
Supports editing and viewing of multiple files simultaneously. I have a floppy A: and 2 hard drives, C: and D:. My DVD burner is E:. Here's the original announcement PDF.

Rather than create the PC from scratch, Estridge's engineers used existing parts from a variety of other companies, seemingly in marked contrast with IBM tradition.
The company had made a virtue out of the fact that it made the components used in its machines. DOS does not come with them preinstalled. And just remember WordPerfect 5.1. Microsoft was already well established in the market by that point. Everyone used Microsoft.
It's a weird way to brand your product. Well, I remember when I was a kid, the computer world was very fragmented. A great game came out? Odds were it wouldn't run on the system that YOU had.
As much as I generally dislike the major players, at least there are only three major platforms that you have to develop for. In fact, you can develop a game for only one market, and still have the opportunity to make quite a bit of money.
IBM chose to use 5.25" double-density soft-sectored disks with 40 tracks, 8 sectors per track, and 512 bytes per sector. This gave their disk a total capacity of 163,840 bytes, known as the "160K disk." In the spring of 1981 I left Seattle Computer to work for Microsoft. This work was generally driven by bug reports and feature requests from IBM, and I stuck to doing the work I was asked to do. But eventually I got up the nerve to ask: why were there only 8 sectors per track on the IBM floppy disk?
We had worked with many disk formats at Seattle Computer, and I had gotten deep into the nitty-gritty of how they were laid out. A 5.25" disk spun at 300 RPM, or 0.2 seconds per revolution. The double-density bit rate was 250 kbps, meaning there was time for 50,000 raw bits, or 6,250 bytes, in one revolution.
There was considerable overhead for each sector beyond its 512 data bytes: gaps, sync fields, the sector ID, and a checksum. Adding a 9th sector to the track would still leave the total comfortably below the 6,250 bytes available. You could almost fit a 10th sector, but not quite (or maybe you could reduce the overhead and make 10 fit!).
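A quick back-of-the-envelope check of that argument. The per-sector and per-track overhead figures below are assumptions for illustration; the exact IBM format numbers aren't given above:

    /* Back-of-the-envelope check of the track-capacity argument.
     * The 100-byte per-sector overhead and 150-byte track-level
     * overhead are assumed values for illustration only. */
    #include <stdio.h>

    int main(void) {
        const double rpm = 300.0;             /* 5.25" double density */
        const double bit_rate = 250000.0;     /* bits per second */
        const double rev_time = 60.0 / rpm;   /* 0.2 s per revolution */
        const int raw_bytes = (int)(bit_rate * rev_time / 8.0); /* 6250 */

        const int data_bytes = 512;           /* per sector */
        const int sector_overhead = 100;      /* assumed: gaps, ID, sync, CRC */
        const int track_overhead = 150;       /* assumed: index gap etc. */

        for (int sectors = 8; sectors <= 10; sectors++) {
            int needed = track_overhead + sectors * (data_bytes + sector_overhead);
            printf("%2d sectors: %5d of %d raw bytes -> %s\n",
                   sectors, needed, raw_bytes,
                   needed <= raw_bytes ? "fits" : "does not fit");
        }
        return 0;
    }

With these assumed overheads, 9 sectors fit with room to spare and a 10th just misses, which is exactly the shape of the argument above.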
IBM's response was something like, "Oh, really?" It was also too late to put a change like this into the product (it was probably 2 to 3 months before the PC shipped). They said they would save it as a new, upgraded feature for a later DOS release.
IBM refreshed the Personal Computer line around 6 months after the initial release. This not only included an updated DOS but also double-sided disk drives. For compatibility, they still had the single-sided format with 8 sectors per track, the 160K disk. And of course new machines would want to use the double-sided 9-sector format, the 360K disk. For some reason, IBM also supported a double-sided 8-sector format, the 320K disk, which served no useful purpose.
With the PC/AT came the high-capacity 5.25" floppy disk. It basically gave the 5.25" disk the same specifications as the older 8" disks, doubling the data rate and spinning the disk at the faster 360 RPM. Capacity increased to 1.2 MB. Eventually the 5.25" floppy disk was replaced by the 3.5" disk. I still see a few of my computers around the office with 3.5" drives.
File System Performance

An important design parameter for the OS was performance, which drove my choice for the file system. I had learned a handful of file management techniques, and I spent some time analyzing what I knew before making a choice.
The simplest technique was contiguous allocation: each file occupies consecutive sectors on the disk. The disk directory only needs to keep track of the first sector and the length for each file, which is very compact. Random access to file data is just as fast as sequential access, because it's trivial to compute the sector you want. But the big drawback is that once a file is boxed in by other files on the disk, it can't grow. The whole file would then have to be moved to a new spot with more contiguous free space, with the old location leaving a "hole".
After a while, all that's left are the holes. Then you have to do a time-consuming "pack" to shuffle all the files together and close the holes. I decided the drawbacks of contiguous allocation were too severe. UNIX uses a clever multi-tiered approach. For small files, the directory entry for a file has a short table of the sectors that make up the file. These sectors don't have to be contiguous, so it's easy to extend the file.
If the file gets too large for the list to fit in the table, UNIX adds a tier. The sectors listed in the table no longer reference data in the file; instead, each entry identifies a sector which itself contains nothing but a list of sectors of file data.
If the file gets huge, yet another tier is added: the table entries each reference a sector whose entries reference a sector whose entries identify the file data. Random access to file data is very fast for small files, but as files get larger and the number of tiers grows, it will take one or two additional disk reads just to find the location of the data you really want. CP/M took a simpler path. It didn't track individual sectors; instead it grouped sectors together into a "cluster" or "allocation unit".
By making each cluster 8 sectors (1K), there were fewer than 256 clusters on a disk. Thus a cluster could be identified using only one byte. But when a file exceeded 16K, it needed a whole new directory entry to store an additional 16K of cluster numbers.
There was no link between these entries; they simply contained the same name and a number identifying which section of the file each one represented (the "extent"). This led to a potential performance nightmare, especially for random access.
When switching between extents, the system had to perform its standard linear search of the directory for a file of the correct name and extent. This search could take multiple disk reads before the requested data was located.
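A sketch may make that cost concrete. The 16 one-byte cluster numbers per entry and the linear search come from the description above; the field layout is simplified and the sample directory is invented:

    /* Sketch of the CP/M-style extent scheme described above.
     * Each directory entry holds up to 16 one-byte cluster numbers
     * (16 x 1K = 16K of file data); bigger files need extra entries
     * with the same name and an "extent" number. The layout here is
     * simplified for illustration, not the exact CP/M format. */
    #include <stdio.h>
    #include <string.h>
    #include <stdint.h>

    struct dirent {
        char    name[12];      /* file name */
        uint8_t extent;        /* which 16K section of the file */
        uint8_t clusters[16];  /* one-byte cluster numbers, 0 = unused */
    };

    /* Linear directory search: find the entry covering a byte offset. */
    static const struct dirent *find(const struct dirent *dir, int n,
                                     const char *name, long offset) {
        uint8_t want = (uint8_t)(offset / (16 * 1024));
        for (int i = 0; i < n; i++)      /* may cost several disk reads */
            if (!strcmp(dir[i].name, name) && dir[i].extent == want)
                return &dir[i];
        return NULL;
    }

    int main(void) {
        struct dirent dir[] = {
            { "BIG.DAT", 0, { 2, 3, 4 } },        /* first 16K lives here  */
            { "BIG.DAT", 1, { 9, 10, 11, 12 } },  /* next 16K: a 2nd entry */
        };
        const struct dirent *e = find(dir, 2, "BIG.DAT", 20000L);
        if (e) {
            int cluster_in_extent = (int)((20000L % (16 * 1024)) / 1024);
            printf("offset 20000 -> extent %d, cluster %d\n",
                   e->extent, e->clusters[cluster_in_extent]);
        }
        return 0;
    }

Every time a random access crosses a 16K boundary, the directory search has to be repeated, which is the nightmare being described.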
Unlike all the other file systems, the FAT system separates the directory entry (which has the file name, file size, etc.) from the map of how the data is stored (the File Allocation Table, or FAT). The FAT's small size meant it was practical, even with the limited memory of the time, to keep the entire FAT in memory at all times. To me, the big appeal of the FAT system was that you never had to read the disk just to find the location of the data you really wanted. FAT entries are in a chain: you can't get to the end without visiting every entry in between, so it is possible the OS would have to pass through many entries finding the location of the data.
But with the FAT entirely in memory, passing through a long chain would still be many times faster than a single sector read from a floppy disk. Another thing I liked about FAT was its space efficiency. There were no tables of sectors or clusters that might be half full because the file wasn't big enough to need them all. The size of the FAT was set by the size of the disk. When I designed DOS, I knew that fitting the cluster number in a single byte, limiting the number of clusters to 256, wouldn't get the job done as disks got bigger.
I increased the FAT entry to 12 bits, allowing over 4,000 clusters. With a cluster size of as much as 16K bytes, this would allow for disks as large as 64MB. You could even push it to a 32K cluster and a 128MB disk size, although that large a cluster could waste a lot of space. These disk sizes seemed enormous to me at the time. Only recently had we seen the first 10MB hard disks come out for microcomputers, and that size seemed absurdly lavish and expensive.
Obviously I'm no visionary. Disk size has grown faster than any other computer attribute. Typical memory size is up by a factor of 30,000, while clock speed is up by a far smaller factor, yet disks have outpaced them both. Microsoft extended the FAT entry to 16 bits, then to 32 bits, to keep up. But this made the FAT so large that it was no longer kept entirely in memory, taking away the performance advantage.
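To make the chaining idea concrete, here is a small sketch of walking a FAT12 cluster chain entirely in memory. The 12-bit packing rule (two entries in every three bytes) is the real FAT12 layout; the miniature FAT image itself is invented for illustration:

    /* Sketch of walking a FAT12 cluster chain entirely in memory.
     * The 12-bit packing (two entries per three bytes) is the real
     * FAT12 rule; the sample FAT below is a made-up miniature disk. */
    #include <stdio.h>
    #include <stdint.h>

    /* Read 12-bit FAT entry n from a packed FAT image. */
    static uint16_t fat12_entry(const uint8_t *fat, uint16_t n) {
        uint32_t off = n + n / 2;            /* n * 1.5 bytes */
        uint16_t pair = fat[off] | (fat[off + 1] << 8);
        return (n & 1) ? (pair >> 4) : (pair & 0x0FFF);
    }

    int main(void) {
        /* Miniature FAT: a file starts at cluster 2 and chains
           2 -> 3 -> 5 -> end (0xFFF marks end of file). */
        uint8_t fat[] = {
            0xF0, 0xFF, 0xFF,   /* entries 0,1: media/reserved  */
            0x03, 0x50, 0x00,   /* entry 2 = 3,  entry 3 = 5    */
            0x00, 0xF0, 0xFF,   /* entry 4 = 0,  entry 5 = EOF  */
        };

        for (uint16_t c = 2; c < 0xFF8; c = fat12_entry(fat, c))
            printf("cluster %d\n", c);
        return 0;
    }

No disk reads are needed to follow the chain; that was the whole point of keeping the FAT resident.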
I don't even know where to begin, given that I am writing to the creator of DOS. But to start, I'll say it is good to see this blog, and I hope to see more from you here; I definitely bookmarked it.
What I suspect actually happened was just a case of sour grapes. That's all; I doubt any high principles were at work here. DRI just lost a lot of money. You emphasize the FAT, of course, but also better device error handling comes to mind. Ironic, huh? I noticed also the "feud" with this fool Wharton. An article I found on the subject mentioned him as a "pal" of Kildall's. It is reasonable to expect he is biased on the matter, given this. You should be able to find the article, as it is titled something like "The man who could have been Bill Gates."
Despite an interest in this issue, I personally am still "turned off" by all this bickering and fighting over money, fame, and whatever. I sometimes wonder why we can't all just get along and enjoy doing and tinkering with cool and interesting things? Then the answer comes to me. Granted, being an active… I'm glad you provided something that, perhaps incidentally, was better than what we could have had.

The rest, as they say, is history. MS-DOS 3.0 followed. It's also around this time that developers start to feel the pinch of the 640KB conventional memory limit imposed by IBM's original hardware specifications.
Every other version of DOS was quickly squished out of existence by Windows 95, and it wouldn't be until the late '90s and the emergence of the dot-com bubble that another command-line OS would yet again rise to prominence, in the shape of Linux. Wikipedia has an excellent account of the history of x86 DOS operating systems, and also a table that compares and contrasts each of the different versions from IBM, MS, Digital Research, and others. If you're interested in the original development of QDOS, check out its creator's blog.
In 1987, I was asked by a magazine editor to write an article about data compression. I wrote a manuscript and an accompanying program, sent them to the editor, and forgot about them. The next time I heard from him, I was told that the magazine had been discontinued. So I uploaded the program to a Japanese BBS; that was May 1, 1988. Soon a number of hobby programmers gathered and began improving on that program. One of them, Miki, was then a medical specialist working at a governmental office.
The LZSS algorithm is based on a very simple idea. Suppose I'm going to write "compression" here. But probably I've already used that word before in this file. If so, I can instead write something like "go back N characters and copy M characters from there," which takes less space than spelling the word out again. In Storer's [8] terminology, this is a sliding dictionary algorithm, analyzed first by Ziv and Lempel [14] and then by Storer and Szymanski [9], among others. Incidentally, there are two distinct Ziv-Lempel (LZ) methods: sliding dictionary [14] and dynamic dictionary [15], in Storer's [8] terminology.
The LZW algorithm [12] belongs to the latter. In LZARI, the literal characters and the match lengths together form an extended set of "characters." These "characters" can be Huffman-coded, or better still, arithmetically coded. There were several versions of LZARI; some of them were slightly different from the above description.
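A minimal sketch of the sliding-dictionary idea follows. It uses greedy matching and a brute-force search; real LZSS and LZARI used a ring buffer and a binary search tree, and the window and match-length limits here are just illustrative values:

    /* Minimal greedy LZSS-style encoder sketch (illustration only).
     * Brute-forces the longest match in a 4KB window and prints the
     * stream of literals and (distance, length) pairs. */
    #include <stdio.h>
    #include <string.h>

    #define WINDOW    4096
    #define MIN_MATCH    3
    #define MAX_MATCH   18

    int main(void) {
        const char *s = "compression: a test of compression";
        size_t n = strlen(s), i = 0;

        while (i < n) {
            size_t best_len = 0, best_dist = 0;
            size_t start = i > WINDOW ? i - WINDOW : 0;
            for (size_t j = start; j < i; j++) {      /* scan the window */
                size_t len = 0;
                while (len < MAX_MATCH && i + len < n && s[j + len] == s[i + len])
                    len++;
                if (len > best_len) { best_len = len; best_dist = i - j; }
            }
            if (best_len >= MIN_MATCH) {              /* emit a match */
                printf("(back %zu, copy %zu)\n", best_dist, best_len);
                i += best_len;
            } else {                                  /* emit a literal */
                printf("'%c'\n", s[i]);
                i++;
            }
        }
        return 0;
    }

Running it on the sample string emits the second "compression" as a single back-reference, which is the entire trick.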
As for compression ratio, Huffman cannot beat arithmetic compression, but the difference turned out to be very small. Other vendors began using similar techniques: sliding dictionary plus statistical compression such as Huffman and Shannon-Fano.
I wondered why they used Shannon-Fano rather than Huffman, which is guaranteed to compress tighter than Shannon-Fano. As it turned out, a then-popular book on compression published in the U.S. seemed to be the source of the confusion. Because LHarc was based on dynamic Huffman, it had to update its Huffman tree every time it received a character. Yoshi and I tried other dynamic Huffman algorithms [5], [10], [11], but improvements were not as great as we desired.
The traditional static Huffman coding algorithm first scans the input file to count the character distribution, then builds the Huffman tree and encodes the file. In my approach, the input file is read only once. It is first compressed by a sliding dictionary method (like LZARI and LHarc), and at the same time the distributions of the "characters" (see above) and positions are counted.
The output of this process is stored in main memory. When the buffer in memory is full (or the input is exhausted), the Huffman trees are constructed, and the half-processed content of the buffer is actually compressed and output. In static Huffman, the Huffman tree must be stored in the compressed file. In the traditional approach this information consumes hundreds of bytes.
My approach was to standardize Huffman trees so that (1) each left subtree is no deeper than its right counterpart, and (2) the leaves at the same level are sorted in ascending order. In this way the Huffman tree can be uniquely specified by the lengths of the codewords. Moreover, the resulting table is again compressed by the same Huffman algorithm.
To make the decoding program simpler, the Huffman tree is adjusted so that the codeword lengths do not exceed 16 bits. Since this adjusting is rarely needed, the algorithm is made very simple.
It does not create optimal length-limited Huffman trees. Incidentally, my early program had a bug here, which was quickly pointed out and corrected by Yoshi.
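This length-only representation is what is now called a canonical Huffman code: given just the codeword length of each symbol, encoder and decoder can reconstruct identical codes. A small sketch, with a made-up length table for illustration:

    /* Sketch of recovering canonical Huffman codes from codeword
     * lengths alone, the property the standardized tree provides.
     * The length table below is invented for illustration. */
    #include <stdio.h>

    #define SYMBOLS 6
    #define MAX_LEN 16

    int main(void) {
        /* Codeword length per symbol (must satisfy Kraft equality). */
        int len[SYMBOLS] = { 2, 2, 3, 3, 3, 3 };

        int count[MAX_LEN + 1] = { 0 };
        for (int s = 0; s < SYMBOLS; s++) count[len[s]]++;

        /* First code at each length: shorter codes sort first. */
        unsigned first[MAX_LEN + 1] = { 0 };
        unsigned code = 0;
        for (int l = 1; l <= MAX_LEN; l++) {
            code = (code + count[l - 1]) << 1;
            first[l] = code;
        }

        unsigned next[MAX_LEN + 1];
        for (int l = 0; l <= MAX_LEN; l++) next[l] = first[l];
        for (int s = 0; s < SYMBOLS; s++) {
            unsigned c = next[len[s]]++;
            printf("symbol %d: length %d, code ", s, len[s]);
            for (int b = len[s] - 1; b >= 0; b--) putchar('0' + ((c >> b) & 1));
            putchar('\n');
        }
        return 0;
    }

Because only the lengths need to be stored (and those lengths can themselves be compressed), the table overhead shrinks from hundreds of bytes to a handful.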
After completing my algorithm, I learned that Brent [3] also used a sliding dictionary plus Huffman coding. His method, SLH , is simple and elegant, but since it doesn't find the most recent longest match, the distribution of match position becomes flat.
This makes the second-stage Huffman compression less efficient. On the basis of these new algorithms, Yoshi began to rewrite his LHarc, but it took him so long (remember, he was a busy doctor!) that I ended up writing an archiver of my own in the meantime. My archiver was quite recklessly named 'ar'.
Actually, I appended version numbers, as in 'ar002' for version 0.02. I should have named it 'har' (after my name), say, because 'ar' collides with the name of UNIX's archiver. It was meant as a testbed for the algorithms, and this is the reason 'ar' lacked many bells and whistles necessary for a real archiver. Yoshi finally showed us his new archiver written in C. It was tentatively named LHx. He then rewrote the main logic in assembler. Yoshi and I wrote an article describing his new archiver, which would be named LH, in the January 1991 issue of "C Magazine" (in Japanese).
The suffix 'arc' of LHarc was deliberately dropped because the people who sold ARC did not want others to use the name. Then we learned that in the new DOS 5.0, 'LH' was the name of an internal command (LOADHIGH), so the new archiver eventually became LHA. Around the same time we learned about the patent situation: sliding-dictionary compression was covered by patents (actually they got three patents!), and furthermore, the original Ziv-Lempel compression method (Eastman et al., U.S. Patent 4,464,650) was patented as well. Are algorithms patentable? See [16]. If these patents should turn out to be taken seriously, all compression programs now in use may infringe some of these patents.
Luckily, not all claims made by those algorithm patents seem to be valid. The foregoing is a slight modification of what I wrote years ago, and that year was a very busy one for me. It was then that I joined the faculty of Matsusaka University. This opportunity should have given me more free time, but as it turned out I got ever busier.
I stopped hacking on my compression algorithms; so did Yoshi. Luckily, all the good things in LHA were taken over, and all the bad things abandoned, by the great new archiver zip and the compression tool gzip. I admire the efforts of Jean-loup Gailly and others. We exchanged a lot of ideas. (Incidentally, you may run into programs claiming to be newer versions of LHarc; I think they are faked versions of LHarc.)

PKWare later developed its own file format, which became immensely popular when the file format was put into the public domain, and BBS users began boycotting the ARC program and using PK programs.
ARC had become popular amongst BBS users, who were paying large amounts of money to transfer files across (by today's standards) painfully slow modems. Any improvement meant money in users' pockets, and tighter compression would mean smaller files, which whipped across the POTS much faster. Its shareware status and simplicity of use were vital to the everyday user, and businesses, recognising its power and versatility, began using it to archive data for better storage (necessary in the days of very expensive hard drives).
Phil was writing in assembly language, and used the best possible algorithms and the processor's features where appropriate. He was still using the LZW algorithm, and whilst he had made improvements in compression and storage techniques, he had still not 'invented' compression.
Phil's Zip algorithm was also used in gzip, and it remains one of the most ported utilities.

"When we were doing the original IBM PC -- and consider this was a brand new hardware and software design -- it was hanging all the time," Bradley says. The only option engineers had to continue the work was to turn off the computer and start it again.
But back in those early days, the need to reboot "would happen a lot," Bradley says. After that, end users got used to it, and the rest is, well, history. Bradley laughs when recalling the joke. Nowadays, Microsoft Windows intercepts the Control-Alt-Delete key combination and displays a pop-up window that allows users to shut down the PC or shows what programs are running.
Bradley muses that it's funny "that I got famous for this, when I did so many other nifty and difficult things." But like it or not, his place in computer history as the father of the three-finger salute is here to stay.
DOS then formed the core of what became Microsoft's Windows software, a flagship product that has stoked the fortunes of the Redmond, Wash., software giant. But more importantly, he added, the firm satisfies his undying need to tinker. Paterson had developed DOS while working at Seattle Computer Products, and took a job with Microsoft around the time it bought the operating system, helping "tune and spruce" what became known as "MS-DOS" in its first iterations, he said.
Paterson would work sporadically for Microsoft for the next 17 years on various products. By the early 1990s, however, Paterson recalls thinking: "Wow, there're millions of copies of this thing out there, and I wrote it originally." Cherry said that DOS also begat companions to Windows -- Microsoft's Office applications for word processing and other tasks.
Office emerged, Cherry added, simply because the advent of computers with an operating system meant "people needed something to do with them." Running the firm out of his house, Paterson said that he's enamored with improving such widespread, existing technologies. He is in the middle of a production run of his "translators" for digital-video recorders, and reports being pleased with the consistent demand for them.
Paterson describes his relationship with Gates during his Microsoft days as mostly unaffected by his creation of DOS. Later, Paterson's only contact with Gates occurred during the regular product presentations to the chief executive, something required of all Microsoft units.
Gates transitioned from Microsoft's chief executive to the role of chairman in 2000. Paterson said that Gates has had a unique impact on Microsoft as a technologically adept executive able to see the big picture for the company's business direction. A Microsoft spokesman said that such reviews will necessarily have to be limited as Gates steps back. Upon rejoining Microsoft, Paterson said he was shown that, had he stayed on with the company after his initial hiring in the early 1980s, his stock options would already have made him a millionaire.
Such flat-fee arrangements were common then, he said, and it was Microsoft's additional engineering strength that ultimately made the product so valuable. Paterson filed a defamation suit against Evans and his publisher. Looking to "simplify" his life recently, Paterson shed all of his directly owned stocks, including those in Microsoft.
He allows that he still likely owns some stock in the company through mutual funds, though he doesn't pay any special attention to mentions of Microsoft in the news.

I set to work writing an operating system (OS) for the 16-bit Intel 8086 microprocessor in April of 1980. At that point my employer, Seattle Computer Products (SCP), had been shipping their 8086 computer system (which I had designed) for about 6 months. At one point we had been expecting a 16-bit OS (Digital Research's CP/M-86) to be available at the end of 1979. SCP wanted to be a hardware company, not a software company.
This would be a big design job that would take time to get right — but we were already shipping our computer system and needed an OS now. So I proposed to start with a "quick and dirty" OS that would eventually be thrown away.
I spent the next year tentatively in graduate school while also working at SCP. So I didn't have much experience in the computer industry. This is not to say I had no experience at all. I had built my own microcomputer systems, and I made my own peripherals for them, which included a Qume daisy-wheel printer with its own Z80-based controller that I designed and programmed myself. I also designed my own Z80-based road-rally computer that used a 9-inch CRT video display mounted in the glove box of my car.
And my school work included writing a multitasking OS for the Z80 microprocessor as a term project. The thrust of that project had been to demonstrate preemptive multitasking with synchronization between tasks.
For SCP, my work had been ostensibly hardware-oriented. But the 8086 CPU had required me to develop software tools, including an assembler (the most basic tool for programming a new processor) and a debugger.
These tools shipped with the product. My hands-on experience with operating systems was limited to those I had used on microcomputers. I had never used a "big computer" OS at all. All programming projects in high school and college were submitted on punched cards.
Hardware Performance

On the few computer systems I personally used, I had experienced a range of disk system performance. North Star DOS did the best job possible with its hardware. Each track had 10 sectors of 256 bytes each, and it could read those 10 sectors consecutively into memory without interruption. To read in a file of 8KB would require reading 32 sectors; the nature of the file system ensured they were contiguous, so the data would be found on four consecutive tracks.
When stepping from track to track, it would miss the start of the new track and have to wait for it to spin around again. That would mean a total of about 6.2 revolutions to read the file (3.2 revolutions of reading plus up to 3 revolutions of waiting). The 5-inch disk turned at 5 revolutions per second, so the total time would be less than 1.3 seconds. The 8-inch CP/M system I used was another story. It used an interleave factor of 6, meaning it would read every sixth sector. The five-sector gap between reads presumably allowed for processing time.
The 8-inch disk turned at 6 revolutions per second, so the total time would be over 2 seconds. This is more than twice as long as the North Star system, which used fundamentally slower hardware. The API made things worse: reading a single disk sector required five separate requests, and only one sector could be requested at a time. (I don't know if all five were needed for every read if, say, the disk or memory address were the same.)
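A rough reconstruction of that timing arithmetic follows. The North Star figures come from the text above; the 8-inch side assumes the standard single-density format of 26 sectors of 128 bytes per track, which is not stated above:

    /* Rough reconstruction of the timing comparison. North Star
     * figures come from the text; the 8-inch format (26 sectors
     * of 128 bytes, standard single density) is assumed. */
    #include <stdio.h>

    int main(void) {
        const double file_bytes = 8192.0;

        /* North Star: 10 x 256-byte sectors/track, 5 rev/s, no interleave.
           32 sectors span 4 tracks; each of 3 track steps wastes
           up to one revolution waiting for the track start. */
        double ns_read_revs = (file_bytes / 256.0) / 10.0;   /* 3.2 revs */
        double ns_total = (ns_read_revs + 3.0) / 5.0;        /* seconds  */

        /* 8-inch CP/M system: 26 x 128-byte sectors/track, 6 rev/s,
           interleave 6 -> each consecutive sector costs 6 sector times. */
        double sectors = file_bytes / 128.0;                 /* 64 sectors */
        double revs = sectors * 6.0 / 26.0;                  /* ~14.8 revs */
        double cpm_total = revs / 6.0;                       /* seconds    */

        printf("North Star (5\"): ~%.2f s\n", ns_total);
        printf("8-inch, interleave 6: ~%.2f s\n", cpm_total);
        return 0;
    }

Under these assumptions the slower 5-inch hardware finishes in about 1.24 seconds while the faster 8-inch drive needs about 2.46 seconds, matching the comparison above.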
With the design I chose, only a single request was needed to read disk data, and that request could be for any number of sectors. Some aftermarket add-on hard disks were available with an interleave factor as low as 3. In 1983 I founded Falcon Technology, which made the first PC hard disk system that required no interleave. Once hard disks started having built-in memory, interleave was completely forgotten.

I was concerned that SCP might have trouble persuading authors of application software to put in the effort to create a DOS version of their programs.
Few people had bought SCP's 16-bit computer, so the installed base was small. Without the applications, there wouldn't be many users, and without the users, there wouldn't be many applications. My hope was that by making it as easy as possible to port existing 8-bit applications to our 16-bit computer, we would get more programmers to take the plunge.
My first blog entry explains this in more detail. This required me to create a very specific Application Program Interface that implemented the translation compatibility.
I did not consider this the primary API; there was, in fact, another API more suited to the 16-bit world that had more capabilities. I myself took advantage of translation compatibility: I took some of my existing 8-bit programs, put them through the translator, and came up with 16-bit programs that ran under DOS. But I don't think anyone else ever took advantage of this process.

CP/M stands alone as the catalyst that launched the microcomputer industry as a successful business.
Without those signs of success, companies like IBM wouldn't have been tempted to enter the business and fuel its growth.

The Significance of the BIOS Interface

The concept of a software interface between separate programs or program components has been around for a long, long time.
The most obvious example is the Application Program Interface API that all operating systems provide to application programs. But interfaces can exist at other levels, not just between the OS and applications. Interfaces are used whenever two components must interact, but the components are developed independently or may need to be changed independently.
Kildall's admirers have described that layering in sweeping terms. John Wharton: "Gary's most profound contribution"; Harold Evans: "truly revolutionary," "a phenomenal advance"; Tom Rolander: "supreme accomplishment," "originator of that layering of the software."
Certainly the idea of these layers of interfaces was not new to the computer industry. For example, UNIX like all operating systems provided an API for application programs, and connected to the hardware with an interface to low-level device driver software. So I guess in distilling down the essence of an OS so it would fit on a microcomputer, we can give Kildall credit for not distilling out too much and leaving out the interface layers.
Except that he actually did, originally. But the advantages that became obvious to him had been just as visible to his predecessors. North Star DOS, of course, had a low-level interface so it could be tailored to work with any hardware. This is where my experience with interface layers began. The new idea was simply that you could do it: you could actually put a general-purpose operating system on a microcomputer. It worked and the market loved it. DOS was built on top of this general groundwork.
Kildall distilled the OS to a minimal, useful set of capabilities that would fit on a microcomputer. This simplified my task in setting the functionality of DOS. I guess this may refer to the fact that DOS has the same interface layers. Or it may refer to the similar function set. It demonstrated that it was possible to pare down the giant operating systems of mainframes and minicomputers into an OS that provided the essential functionality of a general-purpose Application Program Interface, while leaving enough memory for applications to run.
This was a radical idea. In that sense, since Kildall picked an appropriate set of functions, any subsequent microcomputer OS would have the same ones and would be some sort of "knockoff."

The case was dismissed last week, shortly before it was to go to trial. The main reason this happened is that the judge ruled that I am a "limited purpose public figure."
The API is how an application program, such as a word processor, asks the operating system to perform a task, such as to read or write a disk file. The book's claim was that DOS copied CP/M; this is not true, and it doesn't even make sense. DOS implemented the same API as CP/M for compatibility, but the internal workings of DOS are quite different. What do I mean by "implement the same API"? Every operating system has basic functions like reading and writing disk files.
The API defines the exact details of how to make it happen and what the results are. In CP/M, for example, a program opened a file by putting a specific function code (15) in a register, pointing at an 11-character file name, and calling location 5. The very same sequence would also open a file in DOS, while, say, UNIX did not use function code 15, 11-character file names, or "Call 5" to open a file. CP/M was for 8-bit computers; DOS was only for 16-bit computers based on the Intel 8086 microprocessor. While 8-bit programs could not run on 16-bit computers, Intel documented how the original software developer could mechanically translate an 8-bit program into a 16-bit program.
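To make "the very same sequence" concrete, here is a toy model of that calling convention. Function code 15 = open file and the 11-character (8+3) name come from the description above; modeling the "Call 5" entry point as a C function is, of course, purely illustrative:

    /* Toy model of the CP/M-style calling convention described above.
     * Function code 15 = open file and the 11-character name (8+3)
     * are the real convention; modeling "call location 5" as a C
     * function is purely illustrative. */
    #include <stdio.h>
    #include <string.h>

    struct fcb {                 /* slimmed-down file control block */
        char name[12];           /* 11 chars: 8 name + 3 extension  */
    };

    /* The single OS entry point: a function code selects the service,
       just as register C did for CP/M's "Call 5" interface. */
    static int bdos(int func, void *param) {
        switch (func) {
        case 15: {               /* open file */
            struct fcb *f = (struct fcb *)param;
            printf("open \"%.11s\"\n", f->name);
            return 0;            /* 0 = success */
        }
        default:
            return 0xFF;         /* unknown function */
        }
    }

    int main(void) {
        struct fcb f;
        memcpy(f.name, "EXAMPLE TXT", 11);   /* 8+3, space padded */
        f.name[11] = '\0';
        /* A program written against this interface keeps working as
           long as any OS honors the same codes and structures. */
        return bdos(15, &f);
    }

An application compiled against one implementation of this interface runs unchanged against another; that sameness of interface, not of internals, is what "implement the same API" means.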
Only the developer of the program, with possession of the source code, could make this translation. While DRI was free to sue for copyright infringement, the likely success of such an action is controversial at best. There are experts who say no, and there are experts who say maybe, but from what I can tell, as of this writing there has yet to be a successful finalized case. Lichtman provided a handful of case citations to show "that the issues here are not clear in either direction, and, in a hypothetical copyright suit filed by Kildall, he might have won and he might have lost."
One of the citations was New Century Mortgage. We didn't think it really applied, and as a preliminary injunction it hadn't gone through a full trial, let alone appeal.
I would suppose this is the best he had. Hollaar said to me, "I feel that it's clear from the cases under similar circumstances that you would have won." But in the end that would have made absolutely no difference. No one ever used or cared about translation compatibility. I had been wrong to think it was a valuable feature. Lichtman mentioned to me that he was working for SCO in their lawsuits against Linux.

An overlooked court case in Seattle has helped restore the reputation of the late computer pioneer Gary Kildall.
Last week, a judge dismissed a defamation lawsuit brought by Tim Paterson, who sold a computer operating system to Microsoft in 1981, against journalist and author Sir Harold Evans and his publisher, Little Brown. The story: IBM needs an operating system for its new PC. And it needs it fast.
Enter Tim Paterson, programmer at a small Tukwila hardware shop, Seattle Computer Products, and known by Paul Allen to have already written an operating system for a 16-bit processor. But the story of Tim Paterson, now in the eighth year of his current stint at Microsoft, is not as familiar. He squirms, for instance, at the implication that he's fixated on his authorship of DOS.
He holds up a recent profile in Forbes, contrived as a first-person account. Then there's that title. Besides," he laughs, "there's enough people who think it's nothing to be proud of. He figures his place in history is due to timing. And necessity.