Tag Archives: Linux

Linux related posts, most likely about Ubuntu.

Why are there so many Linux distros?

Linux (/ˈlinəks/) is an open-source, Unix-like operating system. It’s kind of a big deal for servers, mainframes, and supercomputers.

More than 90% of today’s supercomputers run some kind of Linux. Many small devices run Linux too, even if you can’t tell they do… for example, the Android OS is based on Linux.

Linux is great: it powers most of the web servers behind your favorite websites, since it’s free, highly customizable, and very reliable.

That’s great, but why are there so many Linux distros, or distributions?

To answer this question, we have to look at how Linux differs from other operating systems such as Windows or Mac OS. Any operating system comes with a preinstalled set of applications, a user interface, and built-in drivers. For Windows, Microsoft decides what software is built into Windows and how the window manager (the piece of software that organizes the display of windows and dialogs on your computer) works for that specific version of Windows. Microsoft alone makes these decisions, and there is only a limited set of customizations to the look and feel of Windows that Microsoft allows us to make, and even those changes can only be made after installing Windows.

For example, if I wanted my Microsoft Windows 9 to have WinAmp installed out of the box, I’d have to request that from Microsoft. They would probably say no, since Windows already has that wonderful media player (that I never use). I also cannot ask them to change the color of menus from white to grey just because I like it better that way. Same story for drivers: I don’t use printers, so I don’t need Windows to ship HP’s generic printer driver when all I have is a Canon, but that’s rather selfish of me.

Now comes Linux. Linux is a kernel (the bridge between applications and the actual data processing done by the hardware) plus a collection of free software packages. The flexibility of the OS allows me to choose anything I want and customize it the way I want. As a programmer, I can fetch the source code of any software my Linux system uses, modify it as I wish, then rebuild it and replace the version installed on my computer.

I can add new packages, remove others, and modify packages as much as I wish. All that’s left to do is create an installation disk (or ISO image) of my customized OS and distribute it the way I wish (of course, there’s a bunch of licenses I have to read first).
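The image-building step might look like the sketch below. The `myos/` layout and volume name are invented, and a real bootable distro image also needs a bootloader (ISOLINUX or GRUB), which this skips; `genisoimage` is one common tool for the job, so the script degrades gracefully when it isn’t installed.

```shell
# Sketch: package a file tree as an ISO 9660 image (layout is illustrative)
mkdir -p myos
echo "MyLinux 1.0" > myos/VERSION

if command -v genisoimage >/dev/null 2>&1; then
    # -V sets the volume label; a real distro ISO also needs boot records
    genisoimage -quiet -o myos.iso -V MYLINUX myos/
    ls -lh myos.iso
else
    echo "genisoimage not installed; skipping image build"
fi
```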

As software keeps updating to add new features and fix bugs, my OS will have to keep up with the packages I’ve modified or created to stay up to date, or it will be discontinued just like many other Linux distros. And if some developers find my OS great, they’ll help me with ideas, testing, and code to keep up and add features to my own software.

This answers the question “How is a Linux distro created?” but not “Why are there so many of them?”.

As I’ve explained, I’ll most likely base my distro on an existing one to ease the integration of software packages into my OS. I’ll also have to find a team that shares my vision and finds the OS I’m creating worthwhile, in order to help me develop and maintain it. Maintaining and developing an OS is not that easy.

There are a few basic Linux distros that most other distros are based on, e.g. Slackware Linux and Red Hat. You can check out the Linux Distros Timeline to see how many distros have ever been developed.

Reigning Pwn2Own champion: "The main thing is not to install Flash!"

Here are the highlights from Charlie Miller's interview:

He thinks Windows 7 will prove more secure than OS X Snow Leopard this year, in part because it doesn't have Java and Flash enabled by default. Windows' full ASLR (address space layout randomization) also gives it a security advantage.

When asked what he thought would make the safest OS and browser combo, he opted for Chrome or IE8 on Windows 7, with no Flash installed, although 'there probably isn't enough difference between the browsers to get worked up about.'

For my money, the juiciest quote from the interview was 'The main thing is not to install Flash!'

On the mobile side, Miller guessed that the iPhone 3GS would be more easily exploitable than the Motorola Droid, mainly because the iPhone's been around longer, and has been subjected to more extensive security research.

You can check out Miller's full answers (in English or Italian!) at OneITSecurity.

The Best and the Worst Tech of the Decade

With only a few weeks left until we close out the 'naughts and move into the teens, it's almost obligatory to take a look back at the best and not-so-best of the last decade. With that in mind, I polled the O'Reilly editors, authors, friends, and a number of industry movers and shakers to gather nominations. I then compiled them together and looked for trends and common threads. So here then, in no particular order, are the best and the worst that the decade had to offer.

The Best

AJAX - It's hard to remember what life was like before Asynchronous JavaScript and XML came along, so I'll prod your memory. It was boring. Web 1.0 consisted of a lot of static web pages, where every mouse click was a round trip to the web server. If you wanted rich content, you had to embed a Java applet in the page, and pray that the client browser supported it.

Without the advent of AJAX, we wouldn't have Web 2.0, GMail, or most of the other cloud-based web applications. Flash is still popular, but especially with HTML 5 on the way, even functionality that formerly required a RIA like Flash or Silverlight can now be accomplished with AJAX.

Twitter - When they first started, blogs were just what they said, web logs. In other words, a journal of interesting web sites that the author had encountered. These days, blogs are more like platforms for rants, opinions, essays, and anything else on the writer's mind. Then along came Twitter. Sure, people like to find out what J-Lo had for dinner, but the real power of the 140 character dynamo is that it has brought about a resurgence of real web logging. The most useful tweets consist of a Tiny URL and a little bit of context. Combine that with the use of Twitter to send out real time notices about everything from breaking news to the current specials at the corner restaurant, and it's easy to see why Twitter has become a dominant player.

Ubiquitous WiFi: I want you to imagine you're on the road in the mid-90s. You get to your hotel room, and plop your laptop on the table. Then you get out your handy RJ-11 cord, and check to see if the hotel phone has a data jack (most didn't), or if you'll have to unplug the phone entirely. Then you'd look up the local number for your ISP, and have your laptop dial it, so you could suck down your e-mail at an anemic 56K.

Now, of course, WiFi is everywhere. You may end up having to pay for it, but fast Internet connectivity is available everywhere from your local McDonalds to your hotel room to an airport terminal. Of course, this is not without its downsides, since unsecured WiFi access points have led to all sorts of security headaches, and using an open access point is a risky proposition unless your antivirus software is up to date, but on the whole, ubiquitous WiFi has made the world a much more connected place.

Phones Get Smarter: In the late 90s, we started to see the first personal digital assistants emerge, but this has been the decade when the PDA and the cell phone got married and had a baby called the smartphone. Palm got the ball rolling with the Treo about the same time that Windows Mobile started appearing on phones, and RIM's BlackBerry put functional phones in the hands of business, but it was Apple that took the ball and ran for the touchdown with the iPhone. You can argue whether the Droid is better than the 3GS or the Pre, but the original iPhone was the game-changer that showed what a smartphone really could do, including the business model of the App Store.

The next convergence is likely to be with Netbooks, as more and more of the mini-laptops come with 3G service integrated in them, and VoIP services such as Skype continue to eat into both landline and cellular business.

The Maker Culture: There's always been a DIY underground, covering everything from Ham radio to photography to model railroading. But the level of cool has taken a noticeable uptick this decade, as cheap digital technology has given DIY a kick in the pants. The Arduino lets anyone embed control capabilities into just about anything you can imagine, amateur PCB fabrication has gone from a messy kitchen-sink operation to a click-and-upload-your-design purchase, and the 3D printer is turning the Star Trek replicator into a reality.

Manufacturers cringe in fear as enterprising geeks dig out their screwdrivers. The conventional wisdom was that as electronics got more complex, the 'no user serviceable parts' mentality would spell the end of consumer experimentation. But instead, the fact that everything is turning into a computer means that you can take a device meant for one thing and reprogram it to do something else. Don't like your digital camera's software? Install your own! Turn your DVR into a Linux server.

Meanwhile, shows like Mythbusters and events like Maker Faire have shown that hacking hardware can grab the public's interest, especially if there are explosions involved.

Open Source Goes Mainstream: Quick! Name 5 open source pieces of software you might have had on your computer in 1999. Don't worry I'll wait...

How about today? Firefox is an easy candidate, as are OpenOffice, Chrome, Audacity, Eclipse (if you're a developer), Blender, VLC, and many others. Many netbooks now ship with Linux as the underlying OS. Open source has gone from a rebel movement to part of the establishment, and when you combine increasing end-user adoption with the massive amount of FLOSS you find on the server side, it can be argued that it's the 800-pound gorilla now.

As Gandhi said, 'First they ignore you, then they laugh at you, then they fight you, then you win.' When even Microsoft is releasing Open Source code, you know that you're somewhere between the fight and win stages.

Bountiful Resources: 56K modems, 20MB hard drives, 640K of RAM, 2 MHz processors. You don't have to go far back in time for all of these to represent the state of the art. Now, of course, you would have more than that in a good toaster...

Moore's Law continues to drive technology innovation at a breakneck pace, and it seems that related technologies like storage capacity and bandwidth are trying to follow the same curve. Consider that AT&T users gripe about the iPhone's 5GB/month bandwidth cap, a limit that would have taken 10 solid days of transferring to achieve with a dialup connection.
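That dial-up figure is easy to sanity-check with shell arithmetic. The calculation below assumes decimal gigabytes and the modem's theoretical 56 kbit/s rate; real-world modem throughput was lower, which pushes the total past ten days.

```shell
# 5 GB over a 56 kbit/s modem, assuming the full theoretical rate
bits=$((5 * 1000 * 1000 * 1000 * 8))  # 5 decimal GB in bits
seconds=$((bits / 56000))             # at 56,000 bits per second
echo "$((seconds / 86400)) days"      # → 8 days
```

Eight days at the theoretical peak; at a realistic 40–45 kbit/s of actual throughput, "10 solid days" is about right.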

My iPhone has 3,200 times the storage of the first hard drive I ever owned, and the graphics card on my Mac Pro has 16,000 times the memory of my first computer. We can now do amazing things in the palm of our hands, things that would have seemed like science fiction in 1999.

The Worst

SOAP: The software industry has been trying to solve the problem of making different pieces of software talk to each other since the first time there were two programs on a network, and they still haven't gotten it right. RPC, CORBA, EJB, and now SOAP litter the graveyard of failed protocol stacks.

SOAP was a particularly egregious failure, because it was sold so heavily as the final solution to the interoperability problem. The catch, of course, was that no two vendors implemented the stack quite the same way, with the result that getting a .NET SOAP client to talk to a Java server could be a nightmare. Add in poorly spec'd components such as web service security, and SOAP became useless in many cases. And the WSDL files that define SOAP endpoints are unreadable and impossible to generate by hand (well, not impossible, but unpleasant in the extreme).

Is it any wonder that SOAP drove many developers into the waiting arms of more usable data exchange formats such as JSON?
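To see why, compare the same one-field request in both formats. The getTemp service below is purely hypothetical (as is the example.com namespace); the point is the envelope overhead, not the API.

```shell
# A hypothetical getTemp request as a SOAP envelope...
cat <<'EOF'
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <getTemp xmlns="http://example.com/weather">
      <zip>10001</zip>
    </getTemp>
  </soap:Body>
</soap:Envelope>
EOF

# ...and the same request as JSON
echo '{"getTemp": {"zip": "10001"}}'
```

One of these can be typed from memory and read at a glance; the other requires a namespace-aware XML toolchain on both ends.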

Intellectual Property Wars: How much wasted energy has been spent this decade by one group of people trying to keep another group from doing something with their intellectual property, or property they claim was theirs? DMCA takedowns, Sony's Rootkit debacle, the RIAA suing grandmothers, SCO, patent trolls, 09F911029D74E35BD84156C5635688C0, Kindles erasing books, deep packet inspection, Three Strikes laws, the list goes on and on and on...

At the end of the day, the movie industry just had its best year ever, Lady Gaga seems to be doing just fine, Miley Cyrus isn't going hungry, and even the big players in the industry are getting sufficiently fed up with the trolls to want patent reform. The iTunes Store is selling a boatload of music in spite of abandoning DRM, so clearly people will continue to pay for music, even if they can copy it from a friend.

Unfortunately, neither the RIAA nor the MPAA is going gently into that good night. If anything, the pressure to create onerous legislation has increased in the past year. Whether this is a last gasp or a retrenchment will only be answered in time.

The Cult of Scrum: If Agile is the teachings of Jesus, Scrum is every abuse ever perpetrated in his name. In many ways, Scrum as practiced in most companies today is the antithesis of Agile, a heavy, dogmatic methodology that blindly follows a checklist of 'best practices' that some consultant convinced the management to follow.

Endless retrospectives and sprint planning sessions don't mean squat if the stakeholders never attend them, and too many allegedly Agile projects end up looking a lot like Waterfall projects in the end. If companies won't really buy into the idea that you can't control all three variables at once, calling your process Agile won't do anything but drive your engineers nuts.

The Workplace Becomes Ubiquitous: What's the first thing you do when you get home at night? Check your work email? Or maybe you got a call before you even got home. The dark side of all that bandwidth and mobile technology we enjoy today is that you can never truly escape being available, at least until the last bar drops off your phone (or you shut the darn thing off!).

The line between the workplace and the rest of your life is rapidly disappearing. When you add in overseas outsourcing, you may find yourself responding to an email at 11 at night from your team in Bangalore. Work and leisure are blurring together into a gray mélange of existence. 'Do you live to work, or work to live?' is becoming a meaningless question, because there's no difference.

Windows 7 already bigger than Snow Leopard and Linux combined

It's only been a couple of weeks since Windows 7 was released, but Microsoft's new OS has already captured a larger percentage of the market than Apple's OS X 10.6 Snow Leopard and Linux (yes, all of Linux). This doesn't come as a huge surprise, considering how many Windows users were clamoring for Win7 after the flop that was Vista. Microsoft says Windows 7's launch outdid Vista's by 234%. Those brisk sales have already netted Windows 7 a 2% share of the world's OS business, compared to just over 1% for Snow Leopard, and just under 1% for Linux.

Despite the strong sales of Win7, Windows as a whole dropped a quarter of a percentage point in October, with Mac and Linux both making small gains. That quarter of a point hardly matters when you've got 90% of the OS market and your new operating system is being adopted quickly, though.

I expect to see Windows swing back up after Windows 7's been available for a while. I mean, we're talking about an operating system that outsold Harry Potter in the UK. Right now, it's only got a 2% share, compared to 19% for Vista and 70% for XP, but that's after only two weeks. Expect that number to zoom upward by the end of November.