Trying out Arch Linux again
Published on 2013-10-16.
I have previously written a small article called Trying out Arch Linux, first in 2008 and then revisited in 2009, approximately a year later. At the time of this writing (2013) I haven't touched Arch Linux in four years, so I decided to re-test it and write some notes. In the previous notes I compared Arch Linux to Debian GNU/Linux (and to some extent FreeBSD and OpenBSD); I will again compare it to Debian GNU/Linux, but will not mention the BSD flavors. In these notes I will not address the problems with Arch Linux mirrors not being in sync. Those problems, however, remain.
Until the 2011.08.19 release Arch Linux provided the Arch Installation Framework (AIF), a dialog-based interactive installation script. The default install was a minimalist base system. Further system customization and expansion (adding a window manager, desktop environment, etc.) had to be done manually, installing packages downloaded from online repositories. However, the AIF was removed in the 2012.07.15 release due to lack of maintainers and has been replaced by a simple command-line script called "pacstrap".
I agree that the AIF was a hand-holding installation script, but at the same time its removal has made installing Arch Linux more time consuming. However, because of the simple way Arch Linux is set up, you can easily create your own installation script if need be.
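For illustration, here is a minimal sketch of what the manual installation looks like today, assuming a single pre-partitioned disk (the device name /dev/sda1 is an assumption; adjust it to your setup). These commands are run from the live installation medium:

```shell
# Format the target partition and mount it
# (the device name /dev/sda1 is an assumption)
mkfs.ext4 /dev/sda1
mount /dev/sda1 /mnt

# Install the minimalist base system into the mount point
pacstrap /mnt base

# Generate an fstab for the new system, then chroot into it
# to finish the configuration (hostname, locale, bootloader, etc.)
genfstab -p /mnt >> /mnt/etc/fstab
arch-chroot /mnt
```

That is the entire "installer" - the rest is you, a text editor, and the documentation.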
On Debian, and on the previous releases of Arch Linux, you can have the same minimalist base system up and running in one tenth of the time it takes on Arch Linux without the AIF. However, one may argue, with some truth to it, that since the AIF is gone the user learns more about how to configure a basic Linux system - and this is important and in perfect alignment with the Arch Linux philosophy.
In my experience the main reason why some people keep going back to a less secure and third-rate operating system is that they were depending on too much "hand-holding" from the onset. They learn nothing. If something goes wrong, which usually is the user's own fault in the first place, they don't know what to do and have no experience in dealing with the system.
Back in the days when I was trying out Linux for the very first time, a good friend of mine, a highly skilled "über-geek" and the very person who recommended Linux to me, said:
If you want to become really good at Linux, delete Microsoft Windows and start using Linux exclusively!
This was the best advice I could have gotten, and once I did that I quickly learned all the basic stuff and moved on to the more advanced stuff.
Another good friend of mine hates using Microsoft Windows and much prefers Linux, yet for the past 15 years or so he keeps returning to Microsoft Windows. After half a year or so, once his Windows computer is filled with viruses, has become really slow, and has presented him with the occasional "blue screen of death", he curses Windows and begins to use Linux again - then the process starts all over. The main reason why he eventually returns to Windows is that he always uses a Linux distribution that resembles Microsoft Windows. Many such distributions have a lot of GUI tools, abstraction layers so to speak, which, while they seem to make administration easier, often end up breaking things, leaving the user completely unaware of where to look in order to solve the problem.
If you use a network manager, and for some reason there is a problem with your network configuration, where do you look in order to solve the problem? Looking at the GUI network manager, which might have caused the problem in the first place, isn't going to solve the problem.
In systems without hand-holding scripts or GUI configuration tools you may spend some extra time setting up your system, but that is time very well spent. If you ever experience any problems - which you seldom do, because you have set things up manually and right in the first place - you know where to look and how to solve them.
In the Debian installer you can choose the expert mode. In expert mode you have to decide everything yourself and you can manually set up a lot of things, however the installer is still very user friendly and you can still choose between graphical or text based installation. Even when you choose text based it still uses dialogs, and you are hence not exposed to any console work like in the current Arch Linux distribution. However, I much prefer the Arch Linux way today.
When you install Arch Linux you have to set up the network manually, using a text editor to edit a configuration file in /etc/. When you install Debian, whether you use the normal install or the expert install, GUI or text based, you never get exposed to the files in /etc/.
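To give an idea of the difference: on a current Arch Linux system you can bring the network up by writing a small netctl profile by hand. The profile below is a hypothetical static setup; the profile name, interface name, and addresses are all assumptions:

```shell
# /etc/netctl/my-static - a hypothetical static Ethernet profile
# (the interface name and all addresses are assumptions)
Description='Static Ethernet connection'
Interface=eth0
Connection=ethernet
IP=static
Address=('192.168.1.10/24')
Gateway='192.168.1.1'
DNS=('192.168.1.1')
```

The profile is then activated with "netctl start my-static" and made persistent with "netctl enable my-static". On Debian the installer writes the equivalent configuration for you.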
On Arch Linux you learn about the internals of the system while you're setting it up. On the other hand, you at least need a printout of the installation manual (or another computer with access to the Arch wiki) which you need to follow step by step.
The Debian installer is cool and it is extremely easy to use, even when run in expert mode, however you'll have to familiarize yourself with the internals of the system after the installation if you want to understand where things reside and how they work. This means that even when you choose the expert installation, you still don't know what is really going on. This is the Debian way, the way it is intended to be in Debian. Once you have the system up and running, you can decide how you want to spend your time.
Even a newbie coming to GNU/Linux for the very first time can set up Debian without any prior experience, and an advanced user can choose the expert install in order to control every detail of the installation. On Arch Linux, however, if all you've got is the installation medium, no other computer to access the Internet, and you haven't done this many times before, you can easily get lost.
Installing Arch Linux is not difficult. Everything is extremely well documented, and the Arch Wiki is one of the best pieces of documentation in the world of Linux distributions; you just have to follow every step meticulously and understand what you're doing.
Arch Linux has no stable snapshots; it's a rolling release. The rolling release model allows one-time installation and continuous seamless upgrades, without ever having to reinstall or perform elaborate system upgrades from one version to the next. Besides using a rolling release model, Arch Linux also uses bleeding edge software. The rolling release model has some benefits, but bleeding edge software also has some potential problems. If you need to run a stable system, where you need your software to work, all the time and every time, then a rolling release with bleeding edge software can cause you problems: an update might break your current setup because upstream has changed the configuration format, etc.
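In practice the rolling release model means that a single command keeps the entire system current; there is no separate procedure for moving from one version to the next:

```shell
# Synchronize the package databases and upgrade every installed
# package to the newest version available in the repositories.
# On Arch Linux this is the only upgrade procedure there is.
pacman -Syu
```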
Debian GNU/Linux comes in three flavors.
- Stable: This is the production release of Debian. It is a stable snapshot. A stable branch in the world of Debian (and many other systems) roughly means that at some point in time you freeze everything and decide which packages go into the stable branch. Once that is decided - often based upon tests of stability - you make sure that every dependency is met correctly and without errors, you make sure that every security problem is fixed in each package, you provide some patches to make everything play nicely together, and you then release the system. Afterwards you only provide security upgrades and really important upgrades that won't break anything on any of the supported architectures (please do consider that Debian supports a high number of architectures).
- Testing: The testing distribution is a rolling release like Arch Linux, but unlike Arch Linux, which gets new software as soon as upstream makes it available (bleeding edge), software in the Debian testing release still undergoes some degree of testing before it is accepted. Because Debian supports a high number of architectures, all packages must be in sync on all architectures where they have been built and must not have dependencies that make them uninstallable. They also have to have fewer release-critical bugs than the versions currently in unstable. That way the testing release is always close to being a release candidate for stable.
- Unstable: The unstable distribution is also a rolling release and it is the branch that perhaps resembles Arch Linux the most. The unstable distribution gets new software as soon as upstream makes it available too. The unstable distribution is where active development of Debian occurs.
In Arch Linux there are no periods of upgrade silence, as Arch has no feature freeze. However, as a stable operating system suitable for servers and/or desktop machines, where the user wants to use his machine as opposed to spending time fiddling with upstream changes, the Debian stable distribution is one of the best Linux distributions in the world.
On Arch Linux, because it is a rolling release with focus on bleeding edge software, you experience both. You get the latest features, but you also get the latest bugs.
On Debian this problem is solved with "backports". Backports are packages taken from Debian testing, adjusted and recompiled for usage on Debian stable. In some cases, usually for security updates, backports are also created from the Debian unstable distribution.
Backports cannot be tested as extensively as Debian stable, and backports are provided on an as-is basis, with risk of incompatibilities with other components in Debian stable. Backports should therefore still be used with care. It is therefore recommended to only select single backported packages that fit your needs, and not use all available backports.
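Using a backported package on Debian stable looks roughly like this, assuming the current stable release is "wheezy" (substitute your release's codename; "somepackage" is a placeholder, not a real package):

```shell
# Add the backports archive for the current stable release
# ("wheezy" is an assumption; use your release's codename)
echo 'deb http://ftp.debian.org/debian wheezy-backports main' \
    >> /etc/apt/sources.list
apt-get update

# Backports are never installed by default; each package must be
# requested explicitly from the backports archive with -t
# ("somepackage" is a placeholder)
apt-get -t wheezy-backports install somepackage
```

Because installation from backports requires the explicit -t flag, the rest of the system keeps tracking stable, which is exactly the "single packages only" usage recommended above.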
One could argue that since bleeding edge software is the newest and latest release it also contains all the latest bug-fixes, hence it is as secure and stable as it can be.
While it is true that bleeding edge software contains all the latest bug-fixes, it also contains all the new features - features that haven't been extensively tested yet and that introduce new bugs into the code.
Frozen software doesn't contain any of the new features, but it still receives all the latest bug-fixes, which are backported and patched into the code.
Hence, while frozen software lacks the latest new features, it only becomes more stable and secure as time passes.
Unless you truly need bleeding edge software, regardless of what operating system you're running, running a stable system is in most cases much preferable.
Using a rolling release doesn't automatically mean you get bleeding edge software. The rolling release simply means that the distribution runs without freezes. Updates and upgrades get pushed out continuously.
A distribution that freezes can also release bleeding edge software, which is actually what Ubuntu does, as it is based upon the Debian unstable distribution. This however makes little sense and Ubuntu has, as a result, become a Linux distribution famous for its many instabilities.
Both Debian and Ubuntu stop the influx of changes for a period before release (freezing) and focus on fixing bugs. The main difference regarding the issue of stable vs. unstable packages is that Debian has a rather long stabilization process, whereas Ubuntu's process is shorter. Debian freezes for a much longer time and tries much harder to fix all the bugs.
When you run the stable branch of the Debian project you can be sure that your system will never break due to an upgrade. Only security upgrades and bug fixes are introduced into the stable branch, and this is especially valued by system administrators. On most other operating systems you haven't got that assurance, and often when you update your system you find that things stop working because the update changed the system. This doesn't happen on the Debian stable branch, and this is where Debian is really valued.
Some people complain that the packages that come with Debian stable are too old, and from a version point of view that is true, but Debian isn't about releasing the latest bleeding edge software. Debian is about releasing a stable system that you can depend upon - a system that works, where packages don't break due to dependency or upgrade issues.
Besides the above, the packages that go into the Debian stable branch have all been tested a lot and have had many more bugs fixed.
So who is bleeding edge software suitable for? Who should run a distribution like Arch Linux or Debian unstable? Software developers, software testers (people who test software and actually submit bug reports), and people who just don't care and who want all the latest features.
This is exactly what is stated on the Debian website about the unstable distribution:
The 'unstable' distribution is where active development of Debian occurs. Generally, this distribution is run by developers and those who like to live on the edge.