MacOSX vs. Ubuntu

I’ve been toying around with Ubuntu Linux, seeing if I could make the move over to that side of things full time. It’s gotten so much better over the last couple of years that it’s finally a viable full-time desktop environment. The Ubuntu distro strikes an almost perfect balance of ease-of-use and hardcore-geek utility. apt-get is great (if hardly user friendly).

Brian’s been working on making the switch from MacOSX to Ubuntu (or UbuntuStudio), so I’ve been thinking about it again.

Most of the apps I live in are there (Firefox is a good enough browser – even if it isn’t Safari, Thunderbird is a good enough mail app – even if it isn’t Mail.app). It’s got all of the server stuff I use either pre-installed or a simple apt-get away.

The one really killer app that is keeping me on MacOSX is Aperture. Nothing comes close on the Linux side of things. Nothing.

There are also tons of niceties in general use in MacOSX. Too many to list. And I’d miss them if I switched to Ubuntu.

Of course, with Parallels, it doesn’t have to be an either-or kind of thing. I can run Ubuntu inside MacOSX on my MacBook Pro. But, if I’m already running MacOSX, Linux is a bit redundant…

Ubuntu Server Not Seeing Localhost?

I’ve been setting up a shared Drupal hosting environment on an Ubuntu Server box, and just about everything is running great. Drupal’s running, MySQL is running, and everything feels nice and fast.

But, the server can’t see itself on the network. It can’t even ping itself (via 127.0.0.1, localhost, or either of the domains pointing to the box). It can ping other boxes, though. It can’t curl or wget or lynx any of the sites on itself. It can’t telnet to its own services (which makes setting up mail services etc… a bit tricky).

The bizarre thing is, I have full access to the services on that box remotely. SSH, FTP, ping, HTTP, etc… are all up and running, and respond normally to requests from off-machine.

None of that would be fatal, but not being able to even curl a URL on the same box means I can’t run a script to automatically run cron.php on all of the Drupal sites.
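For reference, the cron runner I have in mind is just a loop over the hosted sites; something like this sketch (the site list and helper are my own placeholders, not the real config):

```shell
#!/bin/sh
# Hypothetical sketch of the Drupal cron runner; SITES is a placeholder --
# the real script would read the actual hosted domains.
SITES="example1.org example2.org"

cron_url() {
    echo "http://$1/cron.php"
}

for site in $SITES; do
    # The real script would do: curl -s "$(cron_url "$site")" > /dev/null
    cron_url "$site"
done
```

Which is exactly why the box being unable to curl itself is such a showstopper.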

I’m not sure what needs tweaking to let the box see itself over TCP/IP. I’ve checked /etc/hosts, and I’ve checked the apache2 configs (nothing is rejecting requests from localhost).
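One avenue I haven’t fully ruled out (a guess, not a confirmed fix): the loopback interface itself being down or missing its address would produce exactly these symptoms. A quick diagnostic:

```shell
# Sketch: check whether the loopback interface is up and has 127.0.0.1.
ip addr show lo
# A healthy box shows "inet 127.0.0.1/8" on an interface that is UP.
# If lo is down, "sudo ifconfig lo up" brings it back, and
# /etc/network/interfaces should contain these two lines so it
# survives a reboot:
#   auto lo
#   iface lo inet loopback
```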

Any ideas?

On Setting up an Ubuntu Server

For a project I’m involved with, we’re setting up a shiny new server to handle hosting of lots (and lots) of Drupal sites in a shared hosting environment. We were able to pick up a decently specced Dell PowerEdge 2950 at a really good price. Dell wanted a tonne of cash to pre-install RedHat on the box. Um, no thanks. So, our friendly neighbourhood colocation provider installed Ubuntu Server on the box for me (I’m about 1000 km from the server, so couldn’t actually do the physical install myself). The PowerEdge is a 2x dual-core Xeon box, specced similarly to the new Xeon Xserves, but not as nicely packaged. This one requires 2U of rackspace, where the Xserve is shoehorned into a single 1U slot.

We hit a minor snag with the configuration – the onboard NICs weren’t properly lighting up. Some quick Googling, and I believe the solution was found in this thread, and involved running this:

# From the installer shell, switch into the freshly installed system:
chroot /target
# Fix the initrd: make sure the megaraid_sas module (the LSI MegaRAID SAS
# driver) gets loaded at boot by adding it to the initrd module list, then
# rebuild the initrd, keeping a backup of the original:
echo megaraid_sas >> /etc/mkinitramfs/modules
cp /boot/initrd.img-2.6.15-26-amd64-server /boot/initrd.img-2.6.15-26-amd64-server.old
mkinitramfs -o /boot/initrd.img-2.6.15-26-amd64-server 2.6.15-26-amd64-server

After that, everything came up roses. Once I had my admin account, it was pretty trivial to get the rest of the bits set up. I stumbled a bit trying to build httrack from source, but a tip from this thread pointed me to the quick command that installs the full developer’s toolkit:

sudo apt-get install build-essential

The whole apt-get system is pretty sweet. It’s what Fink and DarwinPorts on MacOSX aspire to, but don’t quite reach. Want to install emacs? It’s just a quick sudo apt-get install emacs away. Easy peasy. Databases, ImageMagick, etc… All trivially installed and updated.
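The same pattern covers everything mentioned above. A sketch (package names are the standard Ubuntu ones; the commands are echoed here so the sketch is safe to run anywhere, and dropping the echo actually installs):

```shell
#!/bin/sh
# Sketch of the day-to-day apt-get workflow; PKGS holds standard
# Ubuntu package names for the tools mentioned above.
PKGS="emacs imagemagick mysql-server"
echo "sudo apt-get update"
for p in $PKGS; do
    echo "sudo apt-get install $p"
done
```

And sudo apt-get upgrade later pulls updates for everything installed, in one go.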

So far, setting this server up has been absolutely trivial. And it’s so stinky fast that it should serve the project for quite some time. I might need to set up an Ubuntu client or server locally to play a bit more. It’s not quite MacOSX, but from a server perspective, it’s pretty close. Actually, having spent 6+ years dabbling in MacOSX’s UNIXy innards, running an Ubuntu Server is not much of a stretch. The biggest adjustment is learning where all of the various bits are installed, but that’s easy. I’ll be spending a fair bit of time over the break getting my feet wet in Ubuntu. Should be fun!
