Any attempt to move away from Windows or MacOS into Linux is immediately intimidating. The first thing anyone runs into is the fact that there is no canonical[1] place to get Linux, or even a good list of all available or viable options to choose from. There is even a debate around what the term 'Linux' is supposed to mean.
Here, I am going to attempt to provide some general and non-technical Linux tips and guidelines for someone wanting to have a computer that 'just works' in the same manner you might expect of Windows or MacOS. It is basically autobiographical - I started using Linux for work some fifteen years ago and really don't enjoy needing to tinker to get things working. These days I only use Windows in a gaming VM and haven't had to tinker with anything in at least 5 years, probably longer.
One thing to keep in mind when reading: Since there is basically no consensus on anything ever in Linux-land, none of what I write can be considered absolute. There will be corners of the Linux world doing things very differently from what I describe here. But again, the point here is to get as close to 'turnkey' as possible, which means sticking to the mainstream, so I will ignore this variability for the sake of brevity. Whenever I use absolute terms, feel free to mentally replace them with the level of caveats you feel appropriate.
Especially when coming from Windows but also compared to MacOS, things just work differently on Linux. You must embrace this if you don't want to spend a huge amount of time fighting the underlying assumptions of your new platform. This is really exactly the same as any other change of platform - MacOS and Windows work very differently from one another, just as phones differ from computers, etc. However, something tends to make people believe that they will have the best Linux experience if they try to make it emulate whatever they came from, and this will bite you if you try.
It is easy to forget that most of us have grown up with computers in some form, most usually Windows computers. This means decades of training in The Windows Way. Expecting to not have to spend time learning new things when switching to a whole new platform with zero decades of prior experience is just straight up silly.
There is also an unfortunate amount of disinformation and straight up falsehoods about what Linux is like floating around on the internet. It's no conspiracy, mostly just old memes and half-remembered anecdotes about people's Linux experiences from the 90s. This is not to say that there are no rough edges because there certainly are and probably will be for the foreseeable future, but tales of Linux horror should be taken with large grains of salt, especially when they are presented without context.
Accept that you are moving to a new platform, with its own assumptions and ways of doing things. There will be a learning curve.
Contrary to popular belief, most Linux distributions[2] are actually set up to make life as streamlined as possible in everyday use. This does not mean that you will be immediately familiar with how it works (see above), but it does mean that there is usually a method to the madness, and fighting that (perceived) madness will only bring further pain.
One of the core streamlining tools of any Linux distribution is the package manager. This is the way to install, remove and update all parts of the environment - from kernel and drivers to your favourite desktop clock app. This one was particularly hard for me, since I was used to curating a set of hand-picked applications pulled from all corners of the internet to make my computer behave just the way I liked it. Attempting this, especially when new to Linux, will make life harder.
A side-effect of the repository approach is that the very newest versions of software take a bit to become available. The timing depends on the distribution, but most have some delay. It's a fact of life, and trying to manually work around things to get the newest shiny will make things very hard, very fast.
As usual there are exceptions, currently the most prominent among them probably being Steam. Steam is a platform with a stated ambition of taking Linux gaming mainstream, and Valve do a lot of work to make the experience as streamlined as possible. Hence installing and maintaining Steam the way Valve prefers (which is unfortunately completely separate from distribution package managers) works just fine - it behaves more or less identically to its Windows counterpart.
Allow the OS to work the way it wants to. Don't try to force your own perception of how things should work on the OS, and your experience will be significantly smoother.
Since vanishingly few hardware vendors officially support Linux, most real-life problems one might run into when it comes to using Linux boil down to 'has someone done the work to get this hardware working yet?'. Now, this situation is much better these days than it has ever been - AMD graphics run on a fully open-source driver stack, Nvidia drivers are mostly shipped in official repos[3] and Intel has been actively contributing drivers for their hardware to the Linux kernel for over a decade now. But the devil is, as is tradition, in the details.
Unless Linux support is officially stated by the vendor, as is the case with some Dell laptops and everything from Framework, to name a few, expecting a smooth experience on a laptop launched less than six months to a year ago is more or less insanity. Very common things to have issues with are sleep states, biometric scanners, display brightness settings, keyboard back-lighting, and similar "peripheral" functions that aren't absolutely core to the functioning of the device but are nonetheless important to the overall experience with it. When looking at truly new models, even getting WiFi working might be a major hassle.
When it comes to laptops, outside officially supported models it is important to use a popular and not brand-new model if you want to have a smooth experience.
Desktops are significantly less finicky with their hardware - desktop components are already largely expected to have to work in arbitrary combinations, forcing them to use standards that Linux can leverage to make things Just Work. The rule about brand new launches still stands though, and here the crop of officially supported devices is even thinner. Anything older than 6 months backed by a solid brand should work without any issues. Very rarely you may run into trouble with dodgy WiFi or bluetooth chips, but unlike on a laptop it's probably perfectly workable to just get a cheap USB dongle to replace the troublesome device, plug it into the back of the computer and forget it exists.
In short, the place where one might expect problems is peripheral hardware that is not supplied by Intel, AMD or Nvidia and is less than six months old.
Don't expect a bleeding-edge or very niche laptop to work well on Linux. Buy slightly older models and research what others have experienced. Desktop computers and parts work fine the vast majority of the time.
This may seem slightly contradictory when I just spent two sections talking about how to embrace limitations, but a Linux system provides its user with a huge amount of control over their own user experience as compared to other OSes. The major difference in the kinds of freedom provided by Windows/MacOS vs Linux is that Windows allows you to install anything in any way on top of the fairly large and fixed foundation of what the vendor ships, while Linux allows you to swap out components of the shipped product, while expecting you to then use those components within their designed limits. A reasonable parallel is the walled garden approach of Apple products - the package manager being the 'garden', with significantly lower walls.
In practice this means that you probably won't be managing your hard drive as much on Linux as you might on Windows ("I put all my games in this folder, and work applications go here, ..."), while you might instead put more time into finding a desktop environment that you find enjoyable.
This modularity has been a core aspect of Linux more or less since its inception, which means that trying out new things, swapping them around and, crucially, going back, is pretty simple. Given the right precautions (see the technical points below), even switching entire distributions can be fairly trivial.
Use your freedom. Try different desktop environments, email clients, launchers, what have you. But do it within the confines of whatever framework your distribution provides you with.
We need to talk about the terminal. The scary black window with the blinking cursor that has been the centrepiece of cautionary tales to unruly children for decades. It's true, you will encounter the terminal when using Linux. It's as unavoidable as encountering the registry when using Windows. What seems to be very poorly understood though, is why exactly that is.
I'm writing this article in a terminal window. I've connected via SSH to my web server in the cloud and am editing this file in emacs. I won't have to save my work on my home computer and upload it via an FTP client, I'm just writing where the file needs to end up. This is a core part of the power of the terminal - when the only expectation is to be able to take text input and provide text output, so many things become directly accessible without needing specially written software to support it. I could manage my home router the same way, with no need for a web interface.
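As a minimal sketch of that workflow - the server name and file path here are placeholders, not details of my actual setup:

    ssh me@myserver            # log in to the remote machine; the next command runs there
    emacs drafts/article.txt   # edit the file right where it needs to end up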
The terminal is an immensely powerful tool, because it is a whole alternative environment with which to interact with a computer. Until the early 90s, it was functionally the only way anyone[4] interacted with a computer. Because of the nature of a terminal, where every action is a line of text, it is also an incredibly simple way to share instructions between people. If you need to help someone change a setting in Windows, you have to describe how to navigate the GUI to get there. Instructions to do the same thing using a terminal would amount to "copy and paste these lines". From here the concept of a shell[5] script comes quite naturally - "I do these 5 steps really often, I'll put them in a file so I can run that one file instead."
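To make that concrete, here is what such a file might look like - every path and command in it is made up purely for illustration:

    #!/bin/sh
    # A hypothetical "I do these steps really often" script.
    cd ~/projects/website                 # step 1: go to the working directory
    git pull                              # step 2: fetch the latest changes
    make                                  # step 3: rebuild the site
    rsync -a public/ me@myserver:www/     # step 4: push the result to the server
    echo "All done."                      # step 5: say so

Save it as something like update-site.sh, mark it executable with chmod +x update-site.sh, and those five steps become a single command.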
The downside of this power is that discoverability - the ability to find out how to accomplish something without being actively instructed by another person - is absolutely terrible in a black window that only provides you with a blinking cursor. There are quite a number of creature comforts that improve this situation in a modern terminal, but they really don't help the novice all that much. The terminal is scary and overwhelming for a reason, and GUIs do a lot of things better when it comes to day to day use of a computer.
That doesn't take away from the fact that sometimes there really is no good substitute for using the terminal, and learning the very basics - how to navigate the filesystem, how to edit a file and a handful of core commands - is absolutely indispensable if you want to make your Linux experience as smooth as possible.
To me this is the core difference in approach between Windows/MacOS and Linux - the others view having to resort to the terminal as a failure, while Linux accepts that some tasks are just best suited for the terminal. Highly esoteric things that one might do once in a lifetime do not merit an entire GUI if they can be solved by copying and pasting one cryptic line into a terminal window.
Don't be afraid of the terminal, it's just another tool. And don't be surprised if you barely encounter it unless you seek it out.
Unless you actively do otherwise, Linux applications will place your files in your home directory. Not only do the files you create yourself get saved there, but so do the configurations for every single application you ever use. This means that as long as your home directory survives, you can very easily return to the same state you're used to in any application, even if you completely reinstalled your computer. This is actually true in Windows as well, but migrating your home directory there is pretty tricky.
Because of this, keeping your home directory (or in more general terms, /home) on a separate partition from the rest of the system is highly recommended. This way you can very easily reinstall your computer or even switch distributions, and Firefox will behave just the same way under Mint as under Arch, with no extra work required from you. It also makes backups easier (you do have backups, right?).
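In practice, the split ultimately shows up as a couple of lines in /etc/fstab, the file that tells Linux which partition gets mounted where. A minimal sketch - the device names and filesystem type are made up and will differ on your machine:

    /dev/sda2   /       ext4   defaults   0 1
    /dev/sda3   /home   ext4   defaults   0 2

During a reinstall you simply tell the installer to reuse the /home partition without formatting it, and everything in it survives.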
Coming from the above recommendation of keeping your home directory on its own partition, the question immediately arises: how much space should be dedicated to each partition? When using normal partitions this is an annoying decision - make the wrong choices and you suddenly end up having to reinstall your entire computer because you ran out of space in one or the other. This is where LVM comes in.
LVM is Logical Volume Management, and abstracts partitions away from the physical drive so they are more easily changed around later. This is similar to RAID, where multiple disks are treated as a single disk. LVM can be backed by more than one disk, but the real benefit is that partitions can be grown on the fly. So when you install, you put your whole hard drive into LVM and create fairly minimal partitions on top of it. Then when your games library grows you grow your home partition, and when you install some huge desktop environment you grow the root partition. You don't have to make the size decisions immediately on install but rather when you find out what you need.
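As a sketch of what that looks like in practice - the volume group name, sizes and device are illustrative, and most installers will do the initial setup for you:

    sudo pvcreate /dev/sda2                # mark the partition as an LVM physical volume
    sudo vgcreate vg0 /dev/sda2            # create a volume group on top of it
    sudo lvcreate -L 30G -n root vg0       # a deliberately modest root volume
    sudo lvcreate -L 50G -n home vg0       # a deliberately modest home volume
    # Later, when space runs low, grow a volume and its filesystem in one go:
    sudo lvextend --resizefs -L +100G /dev/vg0/home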
One immensely cool thing about LVM is also that you can migrate your OS between two hard drives without downtime. You can add a second drive, move your partitions to the new drive and remove the old one, all without even rebooting.
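The migration amounts to a handful of commands, assuming the new drive's partition is /dev/sdb1 and the old one is /dev/sda2 (placeholders, of course):

    sudo vgextend vg0 /dev/sdb1    # add the new drive to the volume group
    sudo pvmove /dev/sda2          # move all data off the old drive, while the system keeps running
    sudo vgreduce vg0 /dev/sda2    # drop the old drive from the volume group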
The even more futuristic technology in a similar vein is ZFS, which unifies RAID, LVM and filesystem into a single unit. On Linux it is still very much an enthusiast option though, not least because it isn't shipped as part of the mainline kernel.
Installing an application on Windows basically amounts to dumping some files in a directory and getting some shortcuts to those files on your desktop. On Linux (as sometimes on MacOS, due to their shared Unix roots), installing an application means placing files in lots of different places across the filesystem, which by virtue of those locations make the application discoverable to the rest of the OS.
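If you're curious what that spread looks like, your package manager can show you. A small sketch, assuming a Debian/Ubuntu-style system and using grep as an arbitrary example package:

    dpkg -L grep    # list every file this package placed on the filesystem
    # Expect to see the binary under /usr/bin, its manual page under
    # /usr/share/man and documentation under /usr/share/doc - each location
    # is what makes it findable by your shell, by man, and so on.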
When installing something manually, that is without using the package manager, the installation process tends to look a lot more Windows-y - just a bunch of files unpacked somewhere. When doing it like this, the rest of the OS won't know about the application, setting you up for annoyances and moving you away from the turnkey life.
There are ways to make this approach integrate reasonably well with the rest of the OS, but that is beyond the scope of this document and certainly falls outside of 'just works' territory.
Essential commands: ls, cd, less, grep, locate. There is a short worked example at the end of these tips.
man - short for manual, this will provide you with documentation for commands. man less will provide instructions on how to use the less command. I suggest starting there, since manual pages are displayed using less and you probably want to find out how to exit it. Of course you can just google what you need to get done, but that way you don't get any 'drive-by' knowledge along the way. I promise you'll be googling other things anyway.
Use the tab key all the time. Tab completions help save on typing and they can give essential hints as to how to use a command. When in doubt, tab.
sudo - short for 'superuser do' (or, these days, 'substitute user do'), this allows you to do administrative things. Do not run anything involving sudo without understanding what it does - it could do anything up to and including bricking your computer. And I mean bricking, not just breaking the OS.
When reading terminal output: Read from the bottom upwards. Output in the terminal will tend to place its most important points right at the end, for the simple reason that that's the most visible spot. If the last line says something like "Everything's fine" then you can ignore the 3000 previous lines and go on with your life.
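And the promised worked example of the essential commands - the directory and file names are placeholders, not anything that necessarily exists on your machine:

    cd ~/Documents              # move into a directory
    ls -l                       # list its contents, with details
    less notes.txt              # read a file one page at a time (press q to quit)
    grep -i "linux" notes.txt   # show every line mentioning "linux", ignoring case
    locate fstab                # find files by name anywhere on the system
    man grep                    # read the manual for grep, displayed using less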
[1] But there is Canonical, purveyors of Ubuntu, just to make things even more confusing. ↩
[2] A distribution is what one might term a 'complete' Linux OS. It is what you might download to a USB stick in order to install the OS. As is quickly becoming a theme, this term is not very straightforward to define and I will leave more thorough discussion to other places on the internet. ↩
[3] Short for repositories, the general term for the place(s) a package manager downloads its packages from. ↩
[4] Yes, yes, Windows was first launched in 1985. Please go find someone who actually used it before we start that fight. ↩
[5] Kind of a synonym for terminal, but technically not really. It's fine though. ↩