You also seem to think most Linux devs work for free.
Okay, not most, but no doubt many of them. It doesn't really matter, though. What matters is that there are a lot of them. Again: why? Why did a project like Linux, with the goal of creating an open-source OS based on 20-year-old ideas, gain such strong support?
Simple answer: because they don't seem to think those ideas are obsolete. Why would anyone want to work today with 20-year-old ideas, such as Windows NT's? Because those ideas fit their needs just fine.
Not 3.11, I suppose. You went down too far.
However, the system architecture of Windows 8.1 still uses the same ideas that were developed for Windows NT 4.0. And I find working in the very old Windows 2000 (yeah, NT 4.0 is too old) much more comfortable than in the modern Ubuntu 14.04. (However, I'm not using 14-year-old OSes now.)
A lot of things have changed since Windows 2000.
It is. However, it defines its environment, which has to be GNU or at least GNU-like.
No, it hasn't. See Android, Tizen, and several other smartphone OSes whose names I forget.
But Linux itself, as a kernel, has at least one unpleasant design characteristic. Have you ever wondered why Linux supports much, much fewer devices than Windows? And even when it supports a device, it does so a little more poorly. And that's the biggest problem.
I noticed it. Have you ever noticed how Windows comes preinstalled on computers, and thus not supporting their devices on Windows means a no-sale? That's why Linux has poorer support. It's not technical obsolescence; it's a lack of manpower and of official support from the manufacturers. For example, NVIDIA's binary drivers *for Linux* can run faster in some situations than the Windows 8.1 ones (you can look up the benchmarks on Phoronix).
Let's go further. If you want to run a program on Windows, you only have to get an exe file and execute it. In most cases. And in most cases it will run on Windows 8.1, Windows 8, Windows 7, and even Windows XP.
That has nothing to do with Linux itself; it's related to userspace not keeping binary compatibility. It's not old ideas; it's that MS goes out of its way to keep compatibility. The faster development is, the harder that becomes. Kudos to MS, but it has nothing to do with the software design. You can also just provide a binary for Linux, even without relying on ABI compatibility: you can statically link or ship your own libraries (the latter is a common practice on Windows, too).
And you don't have to compile the program. Just run. And it will run.
Same thing. You don't need to compile programs (well, the user doesn't; in both cases SOMEONE has to). Distributions are the ones that take care of that. Take a look at Steam on Linux. Valve didn't release the source code for Steam, so building it isn't even an option. They did create a bundle of libraries, the Steam Runtime, so binary games can target it without caring about binary compatibility, and a game runs the same on Ubuntu or Gentoo (two poles-apart distributions).
But if your OS is Unix-based, you have to compile the program. If you don't want to compile it yourself, you have to find a compiled version of it, built exactly for your Linux distribution.
False. Point one, directly false. Point two, half-true, but to the same degree as on Windows. The binaries need to target the symbols exposed by the local libraries, or you can provide your own copy of the libraries. This has nothing to do with it being a unixoid.
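To make "targeting the symbols exposed by the local libraries" concrete, here is a minimal sketch using Python's ctypes: it resolves the `cos` symbol from the system math library at runtime, much like the dynamic linker resolves a binary's imports. It assumes a Unix-like system with a locatable libm (the `libm.so.6` fallback is a glibc assumption).

```python
# Sketch: resolving a symbol from a local shared library at runtime,
# the same mechanism the dynamic linker uses for a binary's imports.
import ctypes
import ctypes.util

libm_path = ctypes.util.find_library("m")  # e.g. "libm.so.6" on glibc
libm = ctypes.CDLL(libm_path or "libm.so.6")

# Declare the symbol's signature so the call is made correctly.
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

print(libm.cos(0.0))  # 1.0
```

If the library on the user's system doesn't expose the symbols (or versions) the binary was built against, the load fails, which is exactly the compatibility problem discussed above.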
Well, it might work if, for example, you try to run a program compiled for Debian 6 on your Ubuntu, but most likely it won't. Have you heard of the so-called "DLL hell"? Well, Linux doesn't have "DLL hell". It has just hell.
Again, API/ABI instability has nothing to do with the UNIX design. It has more to do with the nature of software itself: everyone does as they please, and when they disagree, their software becomes incompatible. The only difference with Windows is that it has central control over the OS itself, where the library versions and the exposed symbols can be defined once for everyone. If someone cares about doing the same in the Linux world, they can adhere to the freedesktop.org standards.
That's right! The driver model! Windows has a driver model. It was worked out in the days of the first Windows NT and it's still here. And it's been proven a very smart and productive idea; its success was evident within just a couple of years of its creation. Does Linux have a driver model? Does Linux use drivers? When you buy a super sound card, what is the first thing you do? You download its driver. But if you use Linux? Well, in that case you may want to get a refund, because the recent Linux kernel 3.14 can't use your very expensive device at all. And probably never will.
With Linux, in most cases, you either use an open-source driver, which comes with the kernel (that's why you don't download it), or you download a closed-source one if the manufacturer supports it. A driver model? I can name at least three. Also, you can't blame a lack of drivers on the OS: Windows doesn't produce the drivers for most of the devices it runs; the manufacturers do. It is again unrelated to the technical side of things, and instead related to the commercial side: is it worth it for a manufacturer to support Linux? It depends. If the piece of hardware is for servers, then it is, since Linux is widely used on servers and you're likely to get clients paying big sums for it. On the desktop, it is worth it for some and not for others.
I talk about the same thing. Linux came first. Horses also came first. Then why do we use cars now?
The fact it came first also means it has more momentum. As I argued in another thread, for a lot of use cases it is not really relevant which base design you follow, but to have something working. They got it to a working state, and they see no reason to drop it.
When I say "Linux" I mean all of those Unix-like distributions that mostly use Linux as the kernel and GNU as the userspace: Debian and everything Debian-based, RedHat, SUSE, and so on. The main idea that Linux took from the 1970s is the monolithic kernel. That's why it's so difficult to make Linux support devices.
Linux supports modules, and it's not harder because of that. I don't have experience in driver development, but using a microkernel, as far as I understand, means a lot of message passing for drivers. That's actually harder than just calling a common API in the kernel that does what you want.
I can also mention its POSIX infrastructure
That's too wide a subject. What specifically is problematic about that?
its security model that only allows setting one user and one group on a file (and the user has to be the owner of the file).
I hope Z98 corrects me if I'm wrong here, but isn't Windows moving in that direction, too?
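For reference, the quoted limitation made concrete: classic POSIX file metadata stores exactly one owner, one group, and one permission mode per file. (Modern Linux also offers POSIX ACLs via setfacl/getfacl, which relax this, so the "only one user and one group" claim is only true of the classic model.)

```python
# Classic POSIX permissions: one owner, one group, one mode per file.
import os
import stat
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)
os.chmod(path, 0o640)  # rw- for the owner, r-- for the group, nothing else

st = os.stat(path)
print(stat.filemode(st.st_mode))  # "-rw-r-----"
print(st.st_uid, st.st_gid)       # the single owner uid and group gid
os.unlink(path)
```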
And what about its marvelous concept that any object in the system is a file?
For a start, it's been a long time since *everything* was actually a file. And again, what's wrong with that? It's just a consistent metaphor for different kinds of objects.
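The payoff of the metaphor, in a minimal sketch: a pipe is not a file on disk, but the kernel hands back ordinary file descriptors, so the same read/write calls you would use on a regular file work on it unchanged.

```python
# "Everything is a file": a pipe used through plain file descriptors.
import os

r, w = os.pipe()          # two file descriptors, no file on disk
os.write(w, b"hello")     # same syscall you'd use on a regular file
data = os.read(r, 5)
os.close(r)
os.close(w)
print(data)  # b'hello'
```

That uniformity is the point: one small API covers files, pipes, devices, sockets, and more.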
Their X Window System
But, wait, the X Window System is modern: it's from the 1980s, not the 1970s. (And, yes, I know that the X.org server updates every month. I can also feed my horse every day, but it won't turn into a Ferrari because of that.)
It's funny that I mentioned *twice* that it is obsolete and *not even liked* by most of the Linux community, and that they are on their way to kill it. You also seem to assume the ideas in the Linux world are somehow stagnant, as if nobody ever argues about the file metaphor or questions whether X is state of the art. Get your facts straight. You can hate Linux all you want, as it is a matter of preference, but coming to "teach" when you evidently don't know what you're talking about is just wrong.
with the idea of using network sockets even locally is soooo modern... They even invented Unix sockets for local communication, because even when things run locally they still want to use sockets.
I don't know about that, but Z98 seems to know better and says Windows uses those...
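For what it's worth, here is what a Unix domain socket looks like in practice: socket semantics, but the endpoint lives in the kernel/local filesystem rather than on the network, so there is no TCP/IP stack involved. A minimal sketch using `socketpair`, which gives a connected pair without even binding a filesystem path:

```python
# Unix domain sockets: socket API, purely local transport.
import socket

a, b = socket.socketpair(socket.AF_UNIX, socket.SOCK_STREAM)
a.sendall(b"ping")
reply = b.recv(4)
a.close()
b.close()
print(reply)  # b'ping'
```

Reusing the socket API locally is arguably a feature, not a kludge: the same code paths work whether the peer is on the same machine or across a network.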
Why not? What about Windows Server Core? However, it's not really a console-only Windows, because you can still run GUI applications on it, even though they aren't installed by default.
Then it is not *console-only*, and it comes with the same userspace APIs you need to run GUI apps, which is clearly against the example I gave. And, well, why not? Because it is called Windows because of the windows. Wouldn't have guessed that one, right?
Bluebee wrote:... maybe there are just enough OSes already to fill all the needs, and the need ROS would fill is already occupied by MS (certainly in regard to support), where price is of secondary importance to big companies.
That's certainly a possibility. In technical terms, I think it is true, in fact. But I like Windows technically (while I dislike some of the faces of MS), and I like open source and free software, so it comes naturally to support ReactOS, even when the technical niche is filled. Still, I don't think it'll stay filled, since MS seems to want to eventually go full tablet OS for everyone (with its steps forward and back), and they'll eventually drop support for Windows 7 (IMO, the best Windows in terms of desktop UI).
There is a great need for a working OS able to run XP programs, because companies have to pay millions of dollars to migrate to an OS other than XP, whose support Microsoft has now ended.
Don't most, or virtually all, XP programs run on Windows 7? The way I understand it, it's not the need for a working OS that can run XP programs (I never had a problem running even Win95 programs on Windows 7), but a matter of migration costs (not only in terms of actual hardware and software, but uptime and such) and learning costs.
But when starting this search at comp.os.minix, the first post on top was "LINUX is obsolete", started in 1992, with the last post recent. The first post says "LINUX is a monolithic style system. This is a giant step back into the 1970s" and: "The alternative is a microkernel-based system, in which most of the OS runs as separate processes, mostly outside the kernel. They communicate by message passing. The kernel's job is to handle the message passing, interrupt handling, low-level process management, and possibly the I/O. Examples of this design are the RC4000, Amoeba, Chorus, Mach, and ... Windows/NT."
But then again, you are basing that on a claim made by:
A) a microkernel die-hard, the author of Minix.
B) someone who seems to believe UNIX is still the right way, but is just against monolithic models.
AFAIK, both Linux and Windows can be considered hybrid kernels, in the sense that neither fits the arbitrary definitions of micro- or monolithic kernels (a monolithic kernel shouldn't have loadable modules, for example, while a microkernel shouldn't include drivers in itself, and some other things, IIRC).
It's fine to have issues with Linux. I have plenty. But if you're going to try to argue against Linux, get your facts straight. Linux's supporters are going to be a lot less civil than I am if you try arguing with them in ignorance.
Hey, even while I like ReactOS, I do support Linux (and I use it far more than I use Windows), and I believe I'm being civil...
I agree that a lot of the community is pretty harsh (something related to the elitism of many of them, I guess), but extending those "perks" to all of us is impolite.