Author: Jem Matzan
specific tasks done. To me, the command line was a necessary tool as
well as a last resort — if all else failed, I knew I could count on a
command line program to fix the problem. I already knew that I could do
pretty much anything from the command line if I was willing to sit down,
read manual pages, and learn — or if I really had to. To prove
it, recently I forced myself to use only the CLI for a week. I ended up
learning a lot more than just a few command line arguments.
I decided to use OpenBSD 3.5 on
my Dell Inspiron 3800 laptop for this exercise instead of Debian,
Slackware, or Gentoo Linux because, in addition to knowing that OpenBSD
worked perfectly on the laptop’s hardware, I wanted to use this
opportunity to get to know OpenBSD a little better. Also, OpenBSD has a
feature that I thought would make my week easier: the Ports software database.
Ports makes it easy to find and install programs from the command line
— easier for me than Debian’s APT or Gentoo’s Portage systems. I didn’t
know what programs I’d be using, so it was important to have a decent
selection of mail, IRC, and other programs to choose from. OpenBSD’s
Ports tree, while nowhere near as extensive as FreeBSD’s, was more than
adequate; one could even say that it was designed with CLI use in
mind.
I had the advantage of working with people who primarily use the CLI
for everything they do. I asked them for some recommendations and
occasionally for some help, but I didn’t ask to be walked through the
whole experience — that would invalidate the exercise.
Software
The one program that I needed right off the bat was a competent FTP
client. I was used to using the graphical gFTP, which is about as good as they
come in the world of X11, so whatever I ended up with in the CLI had a
lot to live up to. The standard BSD FTP program that’s included in
nearly every operating system is great for transferring single files,
but it’s horrible for uploading or downloading large groups of files.
Brian Jones, author of Linux.com’s SysAdmin to SysAdmin column,
turned me on to NcFTP, which I found
to be an excellent alternative to the old BSD FTP. Configuration was
pretty simple, and I was able to upload and download multiple files in a
queue and save bookmarks to oft-used FTP sites.
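For anyone who wants to try the same thing, the batch helpers that ship
with NcFTP are what made the multi-file transfers painless. Something
along these lines worked for me (the host, user name, and paths here are
placeholders, not real sites):

    # download a remote directory tree into /tmp/incoming
    ncftpget -R -u myuser ftp.example.com /tmp/incoming /pub/docs
    # upload a local directory to the server's /upload area
    ncftpput -R -u myuser ftp.example.com /upload ./website

Inside the interactive ncftp client, the bookmark command saves the
current site and login so you don't have to retype them next time.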
Much like the FTP situation, I knew I could use the standard
interface to ircii if
absolutely necessary, but I’d prefer something more like the excellent
XChat2 graphical IRC client. I tried
Epic, but found it to be rather
unremarkable. Digging deeper into Epic’s configuration docs, I
discovered that there are many pre-made configuration scripts for it.
After looking at several of them, I settled on DarkStar; while it wasn’t
everything that I wanted, it worked better than what I’d had previously.
I used Epic and DarkStar for the first two work days of my week in the
CLI.
The biggest challenge I had was with email. I was used to Evolution,
which runs my personal and business contacts, calendar, and email. I
didn’t mess with calendar or contacts programs because I didn’t think
I’d need them for the week, but I did explore text-based email
programs.
BSD Mail comes standard with OpenBSD, but its operation and
configuration were a mystery. I could read local email, but I ran into
difficulty trying to configure it to receive POP3 mail. Even if I'd
gotten it to work perfectly, I'd have had to take notes on all of the
shortcut keys, because there are no on-screen helper menus; you're left
at a prompt with a list of messages. I wasn't against learning how to
use it, but with so many other programs' key commands to remember, it
seemed like a better idea to simply try a different email
client.
I set up Mutt to receive POP3
email from one of my accounts; I have eight POP3 email accounts that I
need to collect all in one inbox. The Mutt configuration file is
gigantic; there are so many settings that it took an hour just to read
through each option and its description. The manual page is equally
gargantuan. It’s nice to have a lot of features, but if I have to crawl
through pages and pages of options that I’ll never use just so I can
learn how to do the simplest of tasks, I can safely declare the program
useless to me.
I tried Pine, but that
was just the opposite of Mutt — the interface was so simple that I
couldn’t figure out how to get multiple accounts into it. I could hardly
configure it for one account, even after reading the
instructions.
The trouble with the email programs that I tried is that they are
designed primarily for two kinds of people: those with one email account
who need a bonanza of options that deal with mailing lists, and those
who are concerned only with local system or LAN mail. In other words,
they were made for developers and sysadmins.
I was considering giving up, but a friend suggested that I install Fetchmail and use it to send
all of my POP3 email to my local username. That’s exactly what I did,
and it was a lot easier than I thought it would be. Reading the
Fetchmail manual page was less daunting a task than I’d anticipated, and
I had Fetchmail retrieving all of my POP3 mail inside of an hour. I
created a config file that would regularly retrieve mail from all eight
accounts and then pass it all on to my local user account. This cut out
all configuration difficulties that I’d had with my email programs
because all three could receive local mail by default. Mutt didn’t seem
so useless to me after that. My only complaints about Fetchmail are that
I never figured out how to set it to leave all mail on the server unless
it had already been downloaded, and I couldn’t get it to delete mail
from the server that I deleted locally. I like to save all non-spam
email as a reference.
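For reference, the configuration I describe amounts to a ~/.fetchmailrc
roughly like the following, with two of the eight accounts shown; the
server names, account names, and local user are invented for the
example, and fetchmail will refuse to use the file if other users can
read it:

    set daemon 600                  # poll every ten minutes
    poll pop.example.com protocol pop3
        user "account-one" password "secret" is localuser here
    poll mail.example.net protocol pop3
        user "account-two" password "secret" is localuser here

The "is localuser here" clause is what hands each message over for
delivery to the local mailbox that Mail, Mutt, and Pine all read by
default.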
The vi editor comes standard with pretty much every Unix, BSD, and
GNU/Linux operating system. If you know how to use it, it’s a powerful
and convenient editor to have; if you don’t know how to use it, it’s an
archaic, non-intuitive piece of garbage. My previous opinion of vi
tended toward the latter, but I figured that now was the time to learn
how to use it properly. Instead of vi, I installed vim from Ports
because I wanted to take advantage of its syntax highlighting and
automatic indent features, among other various improvements. The first
thing I did was to run the vim tutorial to learn vim’s finer points. I
got most of the way through it before I learned everything I needed to
know to work on HTML documents and configuration files. Since I already
knew how to insert, delete, save, and quit, the supplementary commands
that I learned were easy to remember. Of all of the programs I used in
my week in the CLI, I spent the most effort learning to better use vim.
I considered using Emacs; in fact, it was my first choice because I
knew slightly more about it than I did about vi. But I couldn't get a
non-X version installed through Ports. Emacs is also not quite as
universal as vi is, and I wanted my learning to apply to as many systems
as possible.
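To be specific about the vim setup I settled on: the tutorial I mention
is started with the vimtutor command, and the features I wanted amount
to a few lines in ~/.vimrc, roughly:

    " ~/.vimrc: syntax highlighting, auto-indent, file type detection
    syntax on
    set autoindent
    filetype on

Anything beyond that is a matter of taste.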
I’d heard of GNU
Screen, but I didn’t think I’d ever need to use it. I couldn’t
imagine needing more virtual terminals than were provided by
default. On the other hand, I didn’t anticipate working from the command
line full-time, and it didn’t take me long to figure out that it was
worth the effort to install and learn Screen. I was amazed at how easy
it was — all I needed to know was how to start Screen, how to open new
terminals within it, and how to switch between them. It took five
minutes to read through the help file, and soon afterward I had my IRC
client and Mutt running in TTY1, two instances of vim open in TTY2, and
four or five instances of Lynx open in TTY3. TTY4 was for other command
line tasks, should they become necessary. This remained my preferred
configuration for the rest of the week.
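For anyone following along, this is essentially the entire Screen
command set I needed, using its stock key bindings:

    screen               start a session
    Ctrl-a c             open a new window inside the session
    Ctrl-a n / Ctrl-a p  switch to the next or previous window
    Ctrl-a 0 through 9   jump straight to a numbered window
    Ctrl-a d             detach; screen -r reattaches later

Everything else Screen can do, I simply never needed during the week.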
I used Lynx as my browser; I
don’t really like Lynx, but what else is there? By the time I got to
learning Lynx, I was struggling to remember all of the command keys for
all of the programs I’d installed. I was so disgusted by the fact that
the Web was no longer graphical that I avoided Web activities as much as
possible.
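That said, the basic Lynx keys are quick to pick up; the ones I
actually needed were few:

    g                    go to a URL typed at the prompt
    Up / Down arrows     move between links on the page
    Right arrow / Enter  follow the selected link
    Left arrow           go back to the previous page
    /                    search within the current page
    q                    quit

The struggle was less about these keys themselves than about juggling
them alongside every other program's bindings.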
Lastly, I used cplay
to play Ogg and MP3 files while I worked. It’s the one program that I
didn’t need to read the documentation for, and it didn’t need a
mile-long config file either.
Putting it all into production
I spent the weekend setting all of this up, experimenting with
alternative programs, and messing with config files. I didn’t make a
whole lot of progress before Monday, and needless to say I didn’t get
much work done at first. Well, actually, I didn’t get much work done all
week because of the snags I ran into.
The most difficult part about using the CLI for production desktop
work is using the Web. To me, the World Wide Web was made to be
a graphical experience, and it should always be that way. Once you’ve
gotten used to Mozilla, you can’t really switch to Lynx and enjoy it.
Lynx is purely for the retrieval of necessary information — you use it
when you really need to read something (like documentation) on the Web
and you can’t get to X. My colleague David “cdlu” Graham tipped me off to a
console graphics viewer called zgv, which Lynx can call as an external
viewer to display images from Web pages. Unfortunately it doesn't seem
to be in OpenBSD's Ports tree, so even if I'd known about it at the
time, it would have been difficult or impossible to install.
One trouble I ran into with IRC was the inability to scroll up
through a conversation, something that I rely on in XChat2 to get me up
to speed. The best I could do was to look through the log with the
/lastlog command and hope that my search string didn't match more
lines than I had screen space for. When I mentioned the problem,
David Graham gently chastised me for not increasing my screen size from
the default (VT100 emulation, which also didn’t support color) to
something higher. I hadn’t even thought about changing the resolution;
ever since the dark days before Linux (and Windows) I was used to the
standard DOS 80×25 screen and figured that it was just the way things
were outside of the GUI.
About the only program I never had any problems with was vim. It
always did what I needed it to do, from writing articles (the first half
of this article was written in vim from my OpenBSD machine) to creating
and editing configuration files. It won’t replace Bluefish for my Web
work, but it was certainly one of the most important programs for me to
have. Sure, vi or the BSD ee editor could have done some or all of the
same work, but they wouldn’t have been as nice to use. Syntax
highlighting is practically a necessity once you’re used to using
it.
At some point I switched from DarkStar/Epic to irssi, an IRC client
with Perl scripting support. While it didn't have the nice
auto-notification of incoming email that Epic had,
it was easier to use and configure, and it had a number of other
features (such as timestamping by default) that I enjoyed. It’s not that
I couldn’t eventually program Epic to do what I wanted it to do; the
issue was that I couldn’t do it inside of an hour. I don’t want to read
a 50-page manual and Google my questions for hours just to do a few
simple tasks. I’d rather have a program that simply works the way I want
it to.
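For completeness, the irssi commands that covered everything I needed
were roughly these (the server name is only an example):

    /connect irc.example.org     connect to a server
    /join #channel               join a channel in its own window
    Alt-1 through Alt-0          switch between irssi windows
    /set timestamps ON           timestamping (already on by default)
    /save                        write the settings to ~/.irssi/config

Compared with scripting Epic, that was the whole learning curve.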
Of all the things I've lost, I miss my mouse the most
It didn’t occur to me to use the mouse in the CLI — as far as I was
concerned, the mouse was a tool meant for graphical environments. I
didn’t use a mouse until Windows 3.0 (except in some DOS games) and I
didn’t anticipate needing one in a BSD terminal. Since I couldn’t scroll
any windows with it and couldn’t use it to switch between programs, I
didn’t notice its absence at first.
The trouble was, I needed to copy and paste URLs from Lynx to vim,
and there seemed to be no way to do it without highlighting the address
with a mouse and then pasting it into another terminal by clicking both
mouse buttons. A laptop’s built-in touchpad or directional button is not
well-suited to this task. I didn’t realize how often I used the mouse
until it was gone.
Without being able to easily add links to articles, I was unable to
complete any of my work. I did write the text portions, adding in href
tags where the links would need to go. I then emailed the articles to
myself and when I got home, I added the appropriate links in Bluefish. I
guess you could say that I cheated, but I had to get some work done.
Keyboard navigation wasn’t all that difficult to get used to,
especially after I had my GNU Screen setup properly organized. I could
switch back and forth between different terminals rapidly, and scroll up
and down documents in vim without forgetting to leave or enter insert
mode. Although I reached a certain level of proficiency, I don’t think
that any amount of experience in the CLI could have replaced the
convenience of mouse navigation in GNOME.
Home sweet GUI home
While I had fun learning new programs and new tricks, I missed all of
the functionality and finesse of GNOME. I don’t know whether it’s just
that the programs I can use in X are easier to configure, use, and
multitask with, or whether I've lost my love for the command line.
I started out my computing life in the CP/M and MS-DOS command line
interfaces, and was dragged kicking and screaming into Windows 95 some
years later. At the time I thought of the GUI as a memory-hogging,
inconvenient, impossible-to-configure, buggy, and generally useless
layer on top of a simple CLI environment that I knew and enjoyed.
Somehow I thought that going back to a more powerful and flexible
command line years later would rekindle my desire for a return to the
DOS days.
My DOS machine did everything I needed it to do. If I had a new
program or game, I knew how to install it and run it, and all
configuration was usually done either from within the program itself, or
from an external configuration program. There were no config files to
hack, and I never once had to read the documentation to figure out how
to use a DOS program. In other words, they were designed sensibly, with
the user in mind.
Many of the BSD and GNU programs that I used during my week in the
CLI, especially all of the mail and IRC programs, almost seemed designed
to be archaic and difficult to use by default. I would have found my
week more enjoyable with programs that were easier to configure, or at
least easier to find relevant information in the manual pages or help
files. I have no problem with reading documentation, except when it’s
dozens of pages long and hides the most critical and basic options and
information amid a big steaming pile of superfluities.
Program design — and of course my own ignorance of such advantages
as Linux terminal emulation, higher terminal resolutions, and the zgv
terminal graphics viewer — aside, there’s also an underlying roadblock
to get past when switching to the CLI: it’s a whole different way of
thinking. Using a GUI makes you think in a broader, more synthetic, and
object-oriented way. It’s not just that the programs look prettier and
have more convenient mouse-driven features — it’s that you
think of your programs and data in terms of how they look on
the screen. In the command line you’re forced to think in a more
analytical way; you have to picture where your data is, where you want
it to go, and how you want to view or manipulate it. The GUI tends to
make you think in terms of programs and what they can enable you to do
with your data; the CLI tends to make you think in terms of data and
what you need to do with it. The GUI causes you to ask yourself how
something looks, whereas you’re using a more kinesthetic sense
when working in the CLI. I’ve come to believe that it is not just a
terminal interface, it’s a whole different kind of user environment, and
it’s not for everyone.
Using the CLI for a week seemed to me like a survival retreat in the
remote wilderness. It was fun and challenging, but I’m glad to be back
home where I can go back to what works best for me. I’ll definitely
revisit the command line as a user environment, but only at my leisure
— not while I’m supposed to be getting work done. I encourage readers
to try this out on their home machine — if not for a week, then at
least for a few days. Hopefully you’ll find my experiences and advice to
be a good starting point in your own CLI adventures.
Jem Matzan is the author of three books, editor in chief of The Jem Report, and a contributing editor for OSTG.