Productivity; sweet productivity!

Three things.

WordPress FTP Madness

WordPress does not need you to have an FTP server. If you’ve been trying to update things and it keeps asking for your FTP server, and you’re thinking “but I own the box it’s hosted on and that’s where my files are.. why can’t I just update? Why do I need FTP?”, then you’re just like me about 15 minutes ago. What you need to do is change the owner of your WordPress directory (and all of its subdirectories and files) to the same user your Apache webserver runs as. This will probably be ‘www-data’ if you’re on Ubuntu. The problem comes up because WordPress, running in your webserver, doesn’t have the ability to write to certain directories it needs to, so it asks you for FTP access to them, which it figures would be a reasonable way to go about it. It probably is, in some cases, but not ours!

What did I do to fix it? I went to the directory above my blog folder (so, I was inside of ‘~/public_html/’ on my server) and I ran

sudo chown -R www-data blog

This will CHange OWNership Recursively to www-data on the directory blog (and therefore on all of its subdirectories and files). You can check what user your server runs as by looking in /etc/apache2/envvars; you should see it exporting APACHE_RUN_USER and APACHE_RUN_GROUP, probably as www-data.
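On a stock Ubuntu install, that check looks something like this (the www-data values below are what I’d expect to see, but trust your own file over me):

grep APACHE_RUN /etc/apache2/envvars
export APACHE_RUN_USER=www-data
export APACHE_RUN_GROUP=www-data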

Once I’d done this, I hit refresh and WordPress worked as desired! I can now update things and not grapple with FTP questions I don’t have answers to.

WordPress Syntax Highlighting

Now that everything worked so darn well, I went to this place and installed SyntaxHighlighterPlus. There are some annoying things about it (I don’t really want to see line numbers, and the colour scheme is hideous), but it allows me to have code that’s worlds prettier in my posts.

Be careful about googling for configuration options! There’s a different SyntaxHighlighter out there, which this plugin is just integrating into WordPress. The syntaxes used for inserting code into your blog are very different between the two, and the non-WordPress one has a lot more options available. I may see what I can do to extend what I’ve got, or just find something else that does the trick better.
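For reference, the WordPress plugin side of it works through a shortcode. If I remember the syntax right, it’s something like this (the language name is just whatever your snippet happens to be in):

[sourcecode language='python']
print "code that is worlds prettier"
[/sourcecode]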

Pretty Permalinks

I’ve been frustrated by having ‘?p=##’ style links for a while now, and I tried going into Settings->Permalinks and changing my format, but whenever I did, WordPress would give a 404 for the pretty link. The old link continued to work, but the pretty links would not take for some reason. A bit of googling told me that you need Apache’s mod_rewrite enabled in order to use pretty links! So I went to /etc/apache2/mods-enabled.. and mod_rewrite wasn’t there. That’s a good sign; it means that’s probably the cause. But how do I enable it? Google wasn’t helpful: the most recent results were from over a year ago, and everyone was telling me to go edit files and touch them and such. I knew Apache had some newfangled tool for enabling modules.. what was it? More googling. a2enmod. You should use a2enmod to enable mod_rewrite.

So: sudo a2enmod rewrite. (Note that the name you hand a2enmod is just ‘rewrite’; asking it for ‘mod_rewrite’ will get you an error saying no such module exists.)

Then… it still doesn’t work! That’s because of two things. First: you have to reload Apache’s configuration. The a2enmod command may have just told you to run ‘/etc/init.d/apache2 restart’, but that’s not a great idea because it actually restarts the process, which is ugly. Instead, run ‘/etc/init.d/apache2 reload’ (remember to use sudo) and it’ll cleanly reload the config files without a restart. The second reason is bigger: you’ve got another change to make. You’ve enabled the module, but you haven’t actually given Apache the directives that allow it to rewrite your site’s URLs, so let’s do that. Go to /etc/apache2/sites-enabled/ and sudo vim (or gedit, nano, whatever) the symbolic link there that leads to your website. It should look a bit like this:

ServerAdmin carss.w@gmail.com

DocumentRoot /home/wcarss/public_html
<Directory />
        Options FollowSymLinks
        AllowOverride None
</Directory>
<Directory /home/wcarss/public_html>
        Options Indexes FollowSymLinks MultiViews
        AllowOverride None
        Order allow,deny
        allow from all
</Directory>

ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
<Directory "/usr/lib/cgi-bin">
        AllowOverride None
        Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
        Order allow,deny
        Allow from all
</Directory>

Change the ‘AllowOverride’ directives in the first two Directory entries (the blank one and the one for your site’s document root) from ‘AllowOverride None’ to ‘AllowOverride all’. The result should be:

ServerAdmin carss.w@gmail.com

DocumentRoot /home/wcarss/public_html
<Directory />
        Options FollowSymLinks
        AllowOverride all
</Directory>
<Directory /home/wcarss/public_html>
        Options Indexes FollowSymLinks MultiViews
        AllowOverride all
        Order allow,deny
        allow from all
</Directory>

ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
<Directory "/usr/lib/cgi-bin">
        AllowOverride None
        Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
        Order allow,deny
        Allow from all
</Directory>

That’s all there is to change. Now you’ve got to run ‘/etc/init.d/apache2 reload’ once more (with sudo again), and then try your pretty permalinks. They should work dandy!
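As an aside on how the pretty links actually work: with AllowOverride in place, WordPress does its rewriting through an .htaccess file in the blog directory, which it can now write for itself (thanks to the chown from earlier). If it doesn’t appear for some reason, the standard block WordPress generates looks like the below; note that a subdirectory install like mine would have /blog/ in place of / in the RewriteBase and the final rule:

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress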

Conclusion

That was a fun detour, and I’ve got a prettier, more functional blog for it; but it certainly didn’t advance my 4chan scraper, get any of my 3000 paper written, advance my 1500 marker, or even get my overdue philosophy paper done. Alas, you can’t have everything at once. 🙂

Auto

Google has united the technologies behind the best competitors at the DARPA Grand Challenge, and crafted something amazing.

The thought of a self-driving car is about as revolutionary as the horseless carriage, but we’ve been expecting it in science fiction for a long time now. The arrival of this technology in a nearly commercializable and highly reliable form makes it worth discussing the possible effects.

Transport of Goods

Truck drivers have very low visibility. They have to stop to eat, stop to use the washroom, stop to sleep, and they can make mistakes. For all of this, they have to be highly trained and, likewise, highly paid. Materials and objects need to be sent from one place to another, and always will. Google’s automatic cars would remedy nearly every one of these issues. Trucks could drive faster, with no breaks, without mistakes, and without the hourly fee of a driver for every truck.

It seems likely that this is where automation will strike first. The trucking industry is entirely about moving goods, and any company willing to put vehicles onto the road that doesn’t bear the cost of drivers, doesn’t lose packages to theft or accident, and can send and receive faster than its rivals by virtue of not needing pee breaks will outperform its competitors. Any failing competitor will need to change to the newer model, and the unions can’t do anything about it. They will become, sadly for them, irrelevant.

The chief obstacle is the law, which is hazy on whether or not a driver actually needs to be in a vehicle while it is being driven.

Transport of People, Public

Buses and taxis are a huge amount of very serious business. Steps toward automation have already been taken: bus ticket purchases are largely done online, and there are systems like UberCab in San Francisco which allow you to automatically request the nearest black-car taxi to your location, provide the destination to the driver, watch their approach on Google Maps, then have payment go through your credit card and a receipt arrive by email. No money changes hands, and no talk is had with the driver if you don’t wish it. Everything is automatic. All that’s left to do is drop the requirement of having a driver.

Buses have a similar set of arguments to trucks: less need to stop, no one to pay, but most importantly, much higher visibility and no mistakes. Imagine a world where bus crashes didn’t happen. Wouldn’t it be great? A computer system can guarantee the best possible response to any situation; human eyes, reaction times, hands, and levels of knowledge just cannot compete with the sensibility of a properly designed automatic system.

Taxis are the same deal on a smaller scale, but consequently have a higher proportion of savings. More drivers for fewer people means that the economic pressure to cut drivers out of the mix is far higher than it is for buses. The risks associated with mistakes are similarly high, though perhaps a taxi driver’s ability to ignore particular rules of the road when they seem unimportant is a powerful part of a company’s ability to get people to places fast. Regardless, I think that taxi automation will come somewhere between second and third.

Transport of People, Private

Your city. Your neighbourhood. Your road. Your car. No drivers, just vehicles going places. This comes down to an extremely personal level and raises some troubling (and fun) questions. For example, let’s say you’re drunk. Should your car stop you from driving at all, and force you to let it handle everything? What if you’re not really drunk, but you’ve just had a /single/ drink? Further questions about the limits of driver control have to be asked: if studies end up showing that drivers are reliably worse than the cars themselves, should we ever be allowed to drive? Maybe only in inclement weather, or maybe that would be exactly the restriction! We can’t know a lot of these things yet, and it’s a scary thought that so much could hang in the balance.

Then there’s the question of bugs. What if something is wrong somewhere in the software? Certain types of software can be mathematically proven to be free of entire classes of error, and it would be marvellous if those sorts of techniques could be used here. If they cannot, though, who will take responsibility for a failure? Will the engineer who wrote the code shoulder the blame for a bug found in a self-driving car’s software? What about the company they work for? Will the driver share blame for not overriding the car when it clearly does something wrong? These are all valid questions. Perhaps an unintended consequence will be that the standardization of software creation and the guarantees of safety and testing that have so long been called for in our industry will finally materialize, and software development will reach a renaissance.

😛

Synergies

There are some cool thoughts that arise from all of these things together. Other sorts of vehicles may end up being created that we’ve never conceived of and that would be impossible without some kind of automated driver. Vehicles will be able to travel at far higher speeds nearly everywhere. Traffic jams can, and will, be erased from memory. Instead we’ll have streams of high-speed, unmanned vehicles navigating our graph of roads, communicating to find the optimal solution. Two cars driving side by side down a highway late at night could cooperate if a deer emerged onto the road, one swerving in a mechanically perfect fashion while the other moved out of the way of the primarily affected vehicle. None of this is possible today, but it certainly will be soon.

Conclusion

Could this change destroy our sense of freedom? Will it just add another failing component to the complex system that is the modern automobile, one which some (but not many) people will actually use? Maybe it’ll be good enough to use on sunny days in certain locales, but not everywhere. Time will tell. For now, though, Google’s got their autos driving around San Francisco, and I think it’s very cool stuff. It’ll affect jobs, the economy, and our personal lives profoundly.

The origin of the Free Software Foundation and the Open Source movement

A New Hope

In 1969, Ken Thompson and Dennis Ritchie put the lion’s share of drive and effort into creating the first version of the Unix operating system for a small PDP-7 they’d gotten hold of at Bell Labs. They wrote most of it with a cross-assembler for the DEC PDP-7 that ran under GECOS (a GE operating system) and produced paper tapes, which had to be carried over to the other machine to be run. Dennis Ritchie has said that “the essence of communal computing, as supplied by remote-access, time-shared machines, is not just to type programs into a terminal instead of a keypunch, but to encourage close communication.”

Some interesting language work got done too, driven forward largely by the move to a PDP-11. Thompson first set out to write a Fortran, but as Ritchie says, “the intent to handle Fortran lasted about a week.” What the effort became instead was B: a mash of BCPL, spartan syntax, and tiny space requirements (particularly on the PDP-7). A much expanded and improved version known as C was written on the PDP-11 itself, and the Unix operating system was slowly rewritten in C from the bottom up (excepting the assembler).

For the next ten years (and beyond), the Unix operating system blossomed through intense amounts of communication, sharing, and tinkering with the open system. Nearly all of the key contributions which made Unix a success came out of the openness, readability, and modularity of the system, particularly the early sharing of the system with Berkeley and the exploits of Bill Joy and Bob Fabry.

The Empire Strikes Back

In 1974, the United States Congress reclassified computer programs as copyrightable material[10]. Whereas they had once been considered analogous to blueprints and the buildings constructed from them (as in, source and compiled binary), they were eventually classed as similar to literary works. This led vendors to license software rather than sell it outright, sidestepping first-sale rights, and that’s where EULAs come from.

Unix had become very popular. It had been programmed in a high-level language, which meant it lent itself very well to teaching operating system design, and it was being constantly improved and upgraded by the cool folks at Berkeley (and by others as well; the source was just part of the package in those days). What had started as a research effort grew and flourished into a successful tool used by academics and businesses worldwide, almost entirely through its openness and the hacking done on it.

So of course, AT&T felt it best to commercialize and sell the system. They did their best (with new copyright powers!) to see that the hacking around and openness got snubbed, particularly at Berkeley. The BSD had been fed largely by military money (it was the OS for nationwide upgrades to ARPANET hardware), but after being left on its own it just became a target for a copyright suit from Ma Bell.

The community largely collapsed as everyone began taking posts at large companies (they had changed the world, after all), and the younger generation was left alone wondering, “What gives?”

(Actually, AT&T probably isn’t that evil, but they didn’t want to let Berkeley make stuff based off of their own stuff, regardless of how okay that had been previously. This put BSD in the legal dumps as they tried to sort everything out with AT&T, and kind of killed BSD’s popularity.)

Return of the Jedi

This is where Richard Stallman comes in. He seems to have been peeved that he missed the boat, so he stood up and shouted that he was building his own boat, and that now was the time to get in line. In October of 1985 he announced the creation of the Free Software Foundation, whose intention was to build a free Unix from scratch. Things really went very well (they often do for the hardworking MIT type), and tools like gcc and emacs came out of his efforts. But he needed a kernel; the original had been written from scratch in C in 1973 and had undergone revision and improvement by the best minds of computer science for over a decade. All of it was copyrighted.

In late 1991, a university student named Linus Torvalds happened to create a simplistic kernel for the 386, and after he posted it to some MINIX-related discussion boards (it was based loosely on MINIX) and worked on it for a few months, others began to help. Within a few years, the GPL’d Linux kernel version 1.0 was available and had been integrated into a number of collections of GNU tools; one example of such a distribution is Debian.

Linux now ‘competes’ with less open variations of Unix, but for all intents and purposes, old Unix has been wiped out of relevance. True BSD and System V releases are unheard of, while OpenBSD, NetBSD, and FreeBSD (open source variations of the free but long lawsuit-encumbered Berkeley Software Distribution of the original Unix OS) are still around with tiny user bases, on the same open model of software as Linux. A different variant, OpenSolaris, is dead (thanks a lot, Oracle), but the closed Solaris system is still in wide use. The closed Macintosh OS X is really just a revamping of NeXT, a proprietary Unix-like from the early 1990s; it is the only popular surviving relative of the old Unix line, and fortunately it still utilizes many of the same standards and tools internally.

Surrounding the GNU system (and later the GNU/Linux distributions) has been a tremendous amount of what Ritchie originally spoke of as the important aspect of a remotely accessible, time-shared environment: a tight-knit community with high levels of communication, forged around the use and modification of these systems. It is an open community, and the materials it works upon and the processes by which it operates are transparent. While by no means perfect, it has continued to be a pinnacle of innovation, has fueled tremendous amounts of new software, and is responsible for tools that run the vast majority of the business and internet infrastructure across the world.

Citations:

(it is 6:00 AM and I do not possess the mental capacity required to write these out in the correct formats — I’ll do my best to go back and inline them at relevant spots though)

  1. http://cm.bell-labs.com/cm/cs/who/dmr/hist.html – this is actually written by Dennis Ritchie. It’s a great read; please go look at this if you have any interest at all in the topic.
  2. http://www.dwheeler.com/secure-programs/Secure-Programs-HOWTO/history.html – this is a short Unix history in a HOWTO on someone’s web-based guide to secure unix programming. I think that this kind of community narrative which is being passed down through the generations is right there at the heart of what Ritchie meant.
  3. http://www.crackmonkey.org/unix.html – this guy seems to have glanced around a bit and done the same thing I did; he quotes the Ritchie paper and basically just restates it with a little bit of extra stuff.
  4. http://oreilly.com/catalog/opensources/book/kirkmck.html – a lot of really good information on the BSD side. This article touches on just how much of a champ Bill Joy is.
  5. http://www.gnu.org/gnu/thegnuproject.html
  6. http://www.gnu.org/gnu/gnu-history.html
  7. http://www.opensource.org/osd.html
  8. http://en.wikipedia.org/wiki/Software_copyright
  9. http://en.wikipedia.org/wiki/Linux_kernel#History
  10. http://en.wikipedia.org/wiki/History_of_Linux
  11. Lemley, Menell, Merges and Samuelson. Software and Internet Law, p. 35

quick idea

Windows 7 has changing desktop backgrounds – so do Ubuntu and OSX and even, ugh, Vista I bet – so I’m thinking, why not have a conversation play out through them?

I recently made a background which is just “Wyatt, go back to work. also, 42/13.37 ~= PI”, and just now it’s occurring to me that I could make more. A response such as “I am working. We are home. Shut up.” could shift in, in the same style; images, snippets of text, and various still media could be interwoven at random to create some kind of interesting ongoing tale of woe, joy, and humour in the background of my day.
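A minimal sketch of the mechanics, for the GNOME 2-era Ubuntu I’m running (the ~/conversation.txt file of snippets and the 1440x900 size are just placeholders I made up; convert comes with ImageMagick):

#!/bin/sh
# pick a random line of the 'conversation' from a file of snippets
LINE=$(shuf -n 1 ~/conversation.txt)
# render it as white text centered on a black image
convert -size 1440x900 xc:black -fill white -pointsize 48 \
        -gravity center -annotate 0 "$LINE" /tmp/background.png
# point the GNOME 2 desktop at the result
gconftool-2 -t string -s /desktop/gnome/background/picture_filename /tmp/background.png

Dropped into cron every hour or so, the conversation would carry itself along.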

I’ll have to do that sometime when I don’t have a midterm in 6 hours, a philosophy paper nearly 2 weeks overdue, another philosophy paper to write by 5pm tomorrow, a good blog post to do by midnight, another midterm at 9:00 AM the following morning (on the same topic as that overdue philosophy paper), grading software to write, and a major programming competition to prepare for. And a business idea to bring to fruition, which is taking a backseat to things it really should not have to.

tangent: that blog post will probably be on the founding of the Free Software Foundation and the dawn of software copyright; I remember reading about that a few years ago and it was pretty cool. Back in the day (I think 1976 was when it changed; 197x for sure), UNIX wasn’t shipped as binaries, it came as source. You bought the source and compiled it. If you wanted to make changes, see how it was written, or copy it and change it, that was cool. Then machines became a bit more standardised and it made sense to ship binaries. Software as a business became feasible. Stallman and his MIT chums cried, and William H. Gates III did too, but in joy.

Okay so probably more like Stallman and them stroked their computer-science beards and Gates nodded approvingly and wrote a letter.