
Greg MacLellan

September 22, 2009

Presario Power Button Hack

Filed under: Technology — groogs @ 10:10 pm

A friend brought over a Compaq Presario x1000 with a temperamental power button (it took many, many presses to turn on), so I agreed to have a look.

The first thing I did was disassemble it, which I didn’t really document. I took the back (partially) off, which I don’t think helped – really, the bezel above the keyboard (where the power button is) is the important part, and removing a couple of screws in the back to take the keyboard off is probably all you need. This part took by far the longest.

Next I verified that it was in fact the power button at fault. Pressing the switch manually did the same as pressing the plastic button above it (which just pushes down on the switch) – which is to say, nothing. Shorting the pins with a probe turned the machine on instantly every time, and there was nothing visibly wrong with the solder connections, confirming it really was the switch that was defective.

I looked around for a similar button to use, or something I could jam in that would work, but didn’t really have anything suitable. I decided instead to re-purpose the mute button (hopefully he will at least change the power settings to put it in suspend, rather than turn off.. I probably should have suggested that in retrospect). It had a ribbon cable connecting it to the motherboard, and shared a common ground with the power button, so I just had to route it over to the power button. After finding the correct wire with an ohm meter, I pulled it out from the ribbon.

Finally, some soldering and that’s it. The mute button now functions as power, and the old power button does nothing. I haven’t yet heard how many times he’s accidentally turned off the system..



May 29, 2008

SvnMergeGui

Filed under: Code,General,Technology — groogs @ 2:01 am

I just created a Google Code project for a quick and dirty little app I built, SvnMergeGui. As you may suspect, it’s a GUI to SvnMerge.

It uses ClickOnce for the installation (because it was really easy to do from Visual Studio), and is pointed at the svn repository for updates – so although I’ve never tried it, in theory it has automatic updates built in. You can download the install files from the project page linked above.

April 4, 2008

UI First Design

Filed under: Code,Technology — groogs @ 12:24 am

Often when I am writing software I design the user interface first, or at least separately, from the rest of the code. Jeff Atwood describes UI First software development on his blog:

Of course, UI is hard, far harder than coding for developers. It’s tempting to skip the tough part and do what comes naturally — start banging away in a code window with no real thought given to how the user will interact with the features you’re building.

Remember, to the end user, the interface is the application. Doesn’t it make sense to think about that before firing up the compiler?

I found this particularly interesting and timely, in light of the topic being the butt of a recent April Fools’ Day joke on The Daily WTF. When I read that post, I actually found it surprising that programmers would think it satirical to consider designing the user interface first. Of course, the rest of the post gets a bit crazier, but I won’t touch that part.

Jeff talks more about the benefits of paper prototyping:

Paper prototypes are usually pitched in terms of doing low-fi usability studies, and rightly so. But I find a paper prototype tremendously helpful even if I’m the only one that ever sees it. I need to create an image in my mind of what I’m building, as it will be seen by the world, before I start pouring the concrete to make it real.

Figuring out the GUI definitely helps you get into the guts of what you need to accomplish in the end, so it is a helpful task in itself. I think there is another tremendous benefit to this, however: developing a nice user interface.

Okay, so this seems like a less than earth-shattering revelation. Many programmers totally miss the mark on it, though. Consider the traditional approach: write a command-line app or hardcode everything, get the code working, and then write a GUI on top of it to expose to the end user. By doing this, you automatically box yourself into writing a GUI based on your underlying API or database layout.

I think there is a big benefit to taking a step back, totally forgetting about all the underlying API work you did, and thinking up the best possible UI design you can for the problem you’re trying to solve. Pretend you’ll be the one using it every day for the rest of your life.

What you come up with may be totally incompatible with your API; that is okay. In fact, it may even point out fundamental flaws in the way you’ve done things (hence the benefit of the initial paper prototype..). It may work exactly with your code. It may require a translation layer of sorts, to convert what the underlying code is doing into how things are represented in the UI. It may also require compromises in the UI to make a translation layer feasible. All of these are good outcomes, though, because the end result is generally a much nicer UI than you would get by just sticking a pretty face on the raw API.

To use a simple example, consider the Google MyMaps interface. Underneath, the database is storing locations using their latitude and longitude, and assigning a name to them. The ultimate simple interface to this would be:

Simple "add a point" UI

Of course, what Google actually did makes much more sense – even though it involves a whole lot of code that translates mouse clicks on the screen into map pixel offsets, and then those into latitude/longitude:

Good "add a point" UI
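To make that translation layer concrete, here’s a rough sketch of the pixel-to-coordinate math a map UI like this needs, assuming the Web Mercator projection Google Maps uses; the tile size constant and function names are mine, not anything from Google’s code:

```python
import math

TILE_SIZE = 256  # pixels per map tile at every zoom level (assumed)

def latlng_to_pixel(lat, lng, zoom):
    """Project a latitude/longitude onto world pixel coordinates."""
    scale = TILE_SIZE * (2 ** zoom)
    x = (lng + 180.0) / 360.0 * scale
    siny = math.sin(math.radians(lat))
    # Web Mercator: y grows downward from the north edge of the map
    y = (0.5 - math.log((1 + siny) / (1 - siny)) / (4 * math.pi)) * scale
    return x, y

def pixel_to_latlng(x, y, zoom):
    """Invert the projection: turn a clicked pixel back into lat/lng."""
    scale = TILE_SIZE * (2 ** zoom)
    lng = x / scale * 360.0 - 180.0
    lat = math.degrees(math.atan(math.sinh(math.pi * (1 - 2 * y / scale))))
    return lat, lng
```

The UI only ever shows the map, while a click gets translated back into exactly what the database stores: a latitude/longitude pair.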

In a lot of situations, the success of your software can come down to how nice and/or easy to use the GUI is, regardless of how powerful it is underneath (I’m sure you can think up many examples of this .. perhaps in say, the operating system world). Whether or not you actually design the UI first is inconsequential – ultimately you need to design the UI for the end users, not for the API.

January 17, 2007

PHP mail() logging

Filed under: Code,General,Technology — groogs @ 12:33 pm

I’ve posted my PHP sendmail wrapper before, but I just noticed that Ilia Alshanetsky has written a PHP mail logging patch that essentially does the same thing, but from within PHP itself. This is nice because it can log the full path of the file and the line where mail() was called, whereas my script can only log as much info as PHP passes to sendmail (which isn’t very much) plus what it can get from the environment. The downside is that since it’s a patch, it requires recompiling – my script can be dropped into any installation (PHP 4/5, and maybe even 3) and just requires a simple php.ini change.

I should also point out that if you’re using this, make sure you don’t “whitelist” localhost in your mail server; otherwise people can just connect to your SMTP server locally and send mail without a username or password. If they use SMTP directly you can’t see which script or virtual host sent the mail either way, but at least if you require authentication you can see which account is being used if it becomes an issue.
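To make that concrete in Postfix terms (the post doesn’t say which mail server is in use, so treat this purely as an illustration), the idea is to require authentication rather than trusting local connections:

```
# /etc/postfix/main.cf (illustrative)
# Don't relay just because the connection is local; require authentication,
# so an abused account can at least be identified:
smtpd_recipient_restrictions =
    permit_sasl_authenticated,
    reject_unauth_destination
```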

January 9, 2007

freePBX 2.2.0

Filed under: General,Technology,Telephony — groogs @ 2:21 am

I haven’t posted here in a while, so I just thought I’d mention something about freePBX 2.2.0, which was released a couple of days ago. For those that don’t know, freePBX is an open-source, web-based configuration interface for Asterisk, which lets you configure and run a PBX with functionality equivalent to commercial PBXs costing several thousand dollars (or more). I’ve been involved with the freePBX project since December 2004 (when it was called AMPortal, or AMP for short), and minus a 6-month hiatus in early 2006, I have been contributing to the project ever since.

2.2.0 is a fairly significant release, adding a fair chunk of new functionality and fixing lots of bugs (over 200), including some long-standing ones that have been around since the 1.x series. Among these were call-handling bugs in certain situations – a call passed from a queue to a ring group, or sent to a cell phone and then forwarded back to another extension, and so on – that sometimes caused voicemail to never pick up, along with some other strange behaviour. I was busy rewriting the modules API to make everything a bit more solid, and wrote a fancy new module administration interface. I also ported a nice new design done by Steven Fischer, which was a much-needed upgrade from the basic look the interface had from the start. I’ve also written some new modules (announcements, phonebook directory, misc applications, speeddial) and done some work on a half dozen others. Overall, we’re quite happy with this release and definitely suggest that anyone using 2.x upgrades.

Going forward, there’s a few things I’d really like to do – write some hooks to use QuickForm, to make writing GUI code a lot simpler; finish my text-to-speech and manualconditions modules (which I’ve started on already); finish the daemon to write config files (instead of having the web server invoke a script); write the framework for a user portal; and add a menu before going to voicemail to allow callers to do various things besides leave voicemail.

Now, if I just had a clone or two that could do my other day-to-day tasks, like going to work, I would be set.

July 23, 2006

MythTV + PVR150

Filed under: Code,General,Technology — groogs @ 10:21 pm

I have been using MythTV in my living room for the last couple of months, and it’s quite a nice setup. Originally, I got it as a media player after the DSM-320 didn’t live up to my expectations (it’s still usable, but it’s been relegated to the 13″ TV in the bedroom).

MythTV main menu

We don’t actually subscribe to cable, and only get a few network channels that ‘leak’ through from the cable internet, so I never really intended it to act as a PVR. One day I happened to see a good deal on a Hauppauge PVR-150 that included a remote, so I figured, what the heck.

Recordings screen

It’s nice to have it record a few shows every day, and the “only keep x episodes” feature is handy. I even have it recording the local 6 o’clock news (and only keeping 1 episode) since I usually never watch it at that time. It’s also nice to always have a few episodes of The Simpsons to pick through.

Media Library screen

I’ve never been a slave to the TV schedule; I’d rather just not watch something than re-arrange the rest of my life around a TV show. For shows that interest me enough, I’ll download them and watch them at my leisure, and never miss an episode or watch them out of order. Having a PVR to do that just makes things easier.

Live TV

Overall, MythTV was fairly straightforward to get working. I installed it on a Debian Sarge box, from source, along with ivtv and lircd.

Program Guide

I wanted to post some of my config files, particularly for the remote setup, since it was very difficult to find examples of these – for some unknown reason, almost no one has posted complete configs (with all the buttons configured) for the remotes.

  • /etc/lircd.conf – Remote definitions for various Hauppauge remotes
  • /home/mythtv/.mythtv/lircrc – Mapping of remote buttons to MythTV commands
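For anyone who just wants the general shape of the lircrc file before downloading the full config, each remote button gets a block like this (the button name and key below are illustrative):

```
# One block per remote button: when "prog" has focus and "button" is
# pressed, lirc delivers the "config" keystroke to it.
begin
    prog = mythtv
    button = Play
    config = P
end
```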

Caller ID on-screen display

I also have Asterisk and FreePBX installed to run my phones (I’ll write another post about that some other time). One of the nice things about it is an on-screen popup when someone calls. I’ve written instructions on how to set up FreePBX with MythTV OSD on the FreePBX documentation wiki.

It does take a bit of reading and a bit of playing around, but it’s well worthwhile to set up MythTV as a PVR.

By the way: sorry about the crappy quality of the live TV images – they’re from my cell phone camera. I couldn’t take a screenshot of the video output (it just came out blue, like in the program guide picture).

June 9, 2006

Where did my insert key go?!?

Filed under: General,Rants,Technology — groogs @ 12:11 am

I spent some time recently trying to find a new keyboard, since my old one has had enough spilled on or in it over the years that it had become uncleanable (it still worked fine, but it was getting pretty gross). Now, I don’t know what keyboard designers are on these days, but they’ve made it an incredibly difficult and frustrating task.

What happened to the insert key?

I don’t have any special keyboard needs; I usually just buy the basic el-cheapo keyboard that gets the job done. So I went down to my local computer shop, as I usually do when I need computer parts. It seems that somebody decided the ‘insert’ key is no longer useful, so they removed it and replaced it with a gigantic delete key. Now, insert is probably one of those keys you use more than you think you do. If you spend any amount of time in vim, its absence will have you screaming bloody murder.

Do we really need a huge delete key?

I then went to the big box stores, thinking I’d be able to find something in their larger selection. Although most of their selection was wireless, every single keyboard had some weird layout. The only ones that actually had an insert key were the ‘natural’ keyboards; unfortunately I can’t type on those, because I hold my arms at that ‘natural’ angle with a regular keyboard. When I use a ‘natural’ keyboard, I have to stick my elbows way out and it gets very uncomfortable.

Yes, that is a power-off button where page up should be.

As I spent time looking at these keyboards, I also noticed some other trends, like weird function key groupings (see the first pic above, where F1-F6 are on one side and F7-F12 on the other, or the second pic, where they’re grouped by 3’s instead of 4’s). The absolute worst design has got to be the poor placement of power buttons. Fifteen minutes on one of these keyboards and you’ll understand. I don’t know who decided to put a ‘power off’ button where ‘page up’ is supposed to be, but something horrible needs to happen to them.

You’d think these power buttons are small enough to be out of the way – they’re not.

It seems that keyboard designers have gotten bored or something, as they all love to mess with the layout of the cursor keys, the insert/delete block, and the function keys. Multimedia keys – even power buttons – are fine, but please just put them outside the regular keyboard area!

I ended up buying a Logitech G15 (I’m sure I could have found and ordered a normal cheap one online, but hey, I was impatient that day), which is totally overkill for me since I don’t play games at all. I can’t even see the LCD, as it’s hidden under my desk because of the keyboard tray. I set up the programmable keys to open some commonly-used programs, but I rarely use them. What I do like about this keyboard though, is the backlighting (actually, that’s quite nice), the feel (light, but you know when you pressed a button), and most of all: it has an insert key.

May 16, 2006

Macs 13% more than PCs? Try 75%..

Filed under: General,Technology — groogs @ 2:39 pm

AppleInsider is running an article claiming that Macs are only 13% more expensive than a comparable Windows desktop, or 10% for a comparable notebook. They did this research to squash the notion that Macs are way more expensive than PCs.
One of my problems is that the cheapest Mac – the Core Solo – is still $699 CAD. At my local computer store, you can buy a basic PC for $279 plus $120 for a copy of Windows XP Home. Sure, this isn’t “comparable” to the Core Solo in terms of components, but in terms of functionality pretty much everything is there – certainly for someone who just wants a PC for web browsing, email, listening to and burning music, and the occasional word processing task.
So what else is the Mac Mini offering that makes it worth that much more money? The small form factor? Okay, that’s neat, but I have lots of space, and don’t care about the size. Firewire, wireless ethernet, and a remote? Again, I don’t need any of them.
So for someone looking to get a basic computer, it seems to me that an Apple is 75% more. This is a far cry from 13%. This is even a place Apple should be looking to convert users, as these basic tasks do not require any Windows-only software — typically the sticking point that prevents switching. They’ve effectively priced themselves out by offering too-fancy hardware with options many users don’t need. I know the Mac Mini was Apple’s answer to offering a low-cost PC, and compared to traditional Apple prices, it is low-cost.. but that’s sad in itself.

I think Apple should really create an introductory system that targets a sub-$500 (CDN) price point, and market it against the basic PC. But then, maybe I’m off the mark here. Maybe people are really willing to spend $700+ on a system that is 10x more than what they need (Future Shop and Best Buy seem to be selling computers, after all..).

I have never owned a Mac, but would love to buy one — unfortunately I just can’t justify it when I can buy a PC that meets my needs for so much less (less than half the price, running Debian Linux). Maybe someday Apple will learn..

March 9, 2006

Time to ditch those CD sets..

Filed under: General,Technology — groogs @ 9:11 pm

Something that really boggles my mind: why do all the distros still push their 3- or 7-CD (or DVD) sets as the main way to download them? It’s 2006, for crying out loud. My distro of choice these days is Debian, and I’ve been using the net-install ISO for a couple of years now.

Waste of Bandwidth

Most distros come with, well, basically everything. Interestingly, most people use one desktop environment. One internet browser. One mail client. One office suite. Many terabytes of bandwidth are wasted on the extra programs people download but never use. This puts undue strain on all the hosting sites, including the mirrors that donate their bandwidth.

With a network-based install, you just download a small, bootable CD with the basic OS on it. Debian’s net-install is 100MB. The installer can download packages from web, FTP, and NFS servers, as you select which packages you actually want.

Quickly Outdated

When you burn a distro to CD, it pretty much goes obsolete immediately. The longer you wait, the more packages get updates. If you install from something a month or two old, chances are the first time you run the update utility it’s going to download a large number of packages again, because they’ve been updated. So now you have a whole ton of packages on CDs – many of which you’ll never use – and once you’re done installing, you’re going to have to download most of them again.

Broadband: It’s Everywhere

A very high percentage of people have broadband access (in the USA, an estimated 29% rural and 39% urban – probably higher in the rest of the world), and I’d be willing to bet that the percentage among people installing Linux (i.e., the techies) is much higher. With a good connection, downloading doesn’t even take that much longer than copying off a CD. (And just think: with all the extra bandwidth freed up by people doing net-installs, the mirrors will be able to go even faster!)

For dialup users or people that want to install on a standalone machine, it makes sense to keep the CDs around.

Faster Install

Instead of having to wait to download 2 or 3 CDs and then do the installation, you can just download the 100MB image (or use an old one you have lying around – if it’s been well designed, it will basically never go out of date) and then download only the packages you need (which should be much less than 3 CDs’ worth). This even makes sense for dialup users, if they don’t otherwise have access to broadband to get the full CD set.

Drawbacks

There are drawbacks to the net-install. If the network hardware isn’t supported by the installer, then obviously it won’t work. For a new user, a net install may be confusing since it’s so different from the typical OS install. I don’t think these are huge issues though, as long as the user interface is well designed and enough drivers are included (and the user isn’t using some obscure and/or obsolete network hardware).

I’d love to hear comments on this, about why it is or isn’t a good idea, and why more distros have not adopted the network install method.

March 6, 2006

Zend Framework

Filed under: General,Technology — groogs @ 7:25 pm

The much-anticipated Zend Framework was released a couple of days ago, and I finally got around to looking at it today. I really wasn’t much impressed.

It’s really not much more than PEAR, with a sloppy MVC framework tacked on. That’s not to say it’s a useless library: the Zend_InputFilter class looks very handy, and the Zend_Db stuff is an interesting implementation (though I’m not sure I’ll be switching away from ADOdb anytime soon).

I’m still not using a formal MVC framework (I haven’t yet found one I really like – though I do implement something similar to the view-controller part, without object-oriented code), and by the looks of things, I won’t be building any applications using just the Zend Framework, either.

Okay, so it’s written by the people behind PHP. Is this really the best benefit it has over any other framework? I certainly don’t seem to be alone in thinking this way.

February 28, 2006

Shiny New Laptop

Filed under: General,Technology — groogs @ 12:51 am

I just got a new Dell Inspiron 6000 to replace my older Inspiron 1100. There was a nice promo on the 6000, and I ended up with a Pentium-M 740, 1GB of RAM, and a 60GB hard drive, as well as integrated wireless b/g, Bluetooth, a DVD writer (not that I will ever use this in a laptop..) and a 15.4″ widescreen LCD.

Inspiron 6000

The first thing I did was try to install our corporate copy of Windows 2000 Professional, to bring it in line with the rest of the systems at work. A few devices weren’t recognized, so I re-installed with XP Pro on the advice of a friend. I ran into the same problems, but eventually worked my way through them. I’m pretty sure I could have fixed the problem with Windows 2000 as well, but I didn’t feel like re-installing again.

The Bluetooth driver was the most problematic. The Dell downloads didn’t even show Bluetooth drivers when I selected my machine or entered my service tag. I would have thought that by entering my service tag they would show me drivers for my exact machine, instead of the 4 different possible wireless options, display options, etc. After looking at my invoice and doing some searching, I found the Bluetooth driver.

One big thing for me is the keyboard layout. My old Inspiron 1100 had a strange layout, with home, pgup/pgdown, and end along the right side of the keyboard, and insert and delete to the left of the cursor keys. It took me a while, but I eventually got used to it. This keyboard has a layout closer to a normal 104-key keyboard, with the aforementioned keys grouped together in the top right, in basically the same configuration as a normal keyboard. I still reach for the delete key in the wrong place, but it’s certainly easier to get used to. I’ve yet to understand why nobody puts a full-size keyboard (minus the number pad) on a widescreen laptop. I guess they figure the 1″ of nothing on each side is a better use of space?

I’m more than impressed with the battery life. I got the 8-cell 80-WHr battery, the biggest you can get for it, and on a full charge, with Bluetooth and WiFi turned on, I get 5 hours. I think that’s pretty impressive for a laptop with a large widescreen display. It is definitely a refreshing change from the dying battery in my 1100, which would get about an hour.

I’ve had some issues with wireless just “stopping”: the WiFi light goes out, and it loses all connectivity, even though the icon in the task tray still shows a good signal. I have to disable and re-enable WiFi using the keyboard (Fn + F2) to restore the connection. It sometimes also takes a couple of minutes to get an address from the DHCP server. At work, I’ve watched the logs, and the DHCP server sees the requests, and offers an address, but the laptop simply doesn’t take it.

This machine has quickly become my primary computer, both at work and at home. I use it for everything from browsing the web and checking email, to programming, to working in Photoshop. Though I never thought I’d say this, the 1GB of RAM really helps for someone like me, who keeps 7 or 8 applications open at any time (it’s not uncommon for me to have 4 instances of Visual Studio open, or a bunch of Firefox tabs and windows, or both).

Overall, I’m quite happy with this laptop, and I would recommend it to others looking for a decent system at a reasonable price.

February 6, 2006

Weird Spam

Filed under: General,Technology — groogs @ 9:04 pm

I’ve been getting a ton of comment spam today, but it’s very strange spam. All the links go to different places (Forbes, Bloglines, Sciencedaily) that don’t seem like the types to be comment spamming.

Usually the first sentence or two is something like “Found your site very useful” or “Cool site! Be back soon!”, followed by some nonsensical sentences with the links to these weird sites. I got about a dozen of them, all posted around 10-20 minutes apart. I don’t have any captchas to crack, so I’m not sure what the delay is about.

All the posts are from different IPs, using different email addresses and different home pages (again, pointing at the same sorts of innocuous sites).

Is anyone else experiencing this, and does anyone have insight into what the purpose might be?

December 16, 2005

Sendmail Wrapper

Filed under: Code,Technology — groogs @ 11:30 am

We had some spam problems last week, one of them caused by a form that wasn’t properly escaped. While that problem was fixed, the real problem was that it was hard to figure out what script had the issue.

To solve this, I wrote a sendmail wrapper for use by PHP (though really it could be used by anything) that logs the message along with the date, a message id (also inserted in the headers), and the current directory (which gives the location of the originating script).

It also extracts the domain name from the current directory, but this part is server-specific, so you’ll need to change the pattern to match your file system.

Eventually I’d like to include support to check for a maximum number of recipients, and maybe some other heuristics to check for spam.
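The real script is linked below; as a sketch of the same idea, a logging wrapper only has to record some context and then hand the message off to the real sendmail. Everything here (paths, log format, header name) is my assumption for illustration, not what the linked script actually does:

```python
#!/usr/bin/env python
# Sketch only: paths, the log format, and the header name below are
# assumptions, not what the linked script actually uses.
import os
import subprocess
import sys
import time
import uuid

SENDMAIL = "/usr/sbin/sendmail"              # assumed location of the real MTA
LOGFILE = "/var/log/sendmail_logged.log"     # assumed log location

def stamp(raw_message, msg_id, cwd, timestamp):
    """Build the log line and a copy of the message with the id in its headers."""
    log_line = "%s id=%s cwd=%s\n" % (timestamp, msg_id, cwd)
    message = "X-Wrapper-Id: %s\n%s" % (msg_id, raw_message)
    return log_line, message

def main():
    # The current directory reveals which script (and virtual host) sent the mail
    msg_id = uuid.uuid4().hex
    log_line, message = stamp(sys.stdin.read(), msg_id, os.getcwd(),
                              time.strftime("%Y-%m-%d %H:%M:%S"))
    with open(LOGFILE, "a") as log:
        log.write(log_line)
    # Hand the stamped message to the real sendmail with the original flags
    return subprocess.run([SENDMAIL] + sys.argv[1:], input=message.encode()).returncode
```

Installed in sendmail’s place (for PHP, via the sendmail_path setting in php.ini), every message gets logged with enough context to trace it back to the offending script.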

You can get the script at:
http://gregmaclellan.com/php/sendmail.phps 

You should save this file as /usr/local/sbin/sendmail_logged.

The reporting script is at:
http://gregmaclellan.com/php/mailreporting.sh

There are instructions in this file for how to add it to cron.

Let me know if you have any comments or suggestions, make any improvements, or find any bugs.

 

December 5, 2005

Search: Time vs. Relevancy

Filed under: General,Technology — groogs @ 11:25 am

Something that seems to be missing from searches is time. Search engines base their results on relevancy, which makes finding newer methods of doing something difficult.

For example, I will search for how to do something in Linux, like configuring a RAID array. There is a ton of information on this, but the most relevant hits you get are about configuring raidtools. mdadm has replaced raidtools as the tool of choice, but since raidtools has been around so long, and so many old pages link to it, it scores the highest. I’m sure there are millions of other examples of this on other topics too.

Google has an advanced search where you can specify pages modified in the last x months, but it doesn’t really help much. One of the pages returned when I limit the search to the last 3 months has a revision history typed out at the top, and it shows the last update in 2003. MSN has a “Search builder” function where (among other options) you can specify how important it is for a result to be recently updated, popular, and a relevant match. This still doesn’t bring up really relevant results. Yahoo is the only one of the three that actually returns an mdadm-related result as #1 when you search within the last 3 months. (I should point out that both Google and Yahoo return this same page as #5 and #6, respectively, but my point is that someone who knows nothing about the topic is probably going to pick #1 or #2, and implement RAID with the older raidtools method.)

MSN’s search-tuning functions

All three have a news search engine that returns date-based results for recent news items, but this is pretty limited in that it only searches news sites. Linux software RAID developments aren’t exactly breaking news on CNN, so a news search isn’t the place to find this stuff.

I think one problem with date-based results as they stand is how the engines are likely determining the date of a page. If they are using the Last-Modified header (part of the HTTP specification), that would explain a lot of the problems. The Last-Modified header can change because content is dynamically generated, because content was moved to another server with FTP, because files were copied without preserving the date/time, or even because of a misconfigured web server. What they should be doing is comparing the contents of the page to the contents from the last time they indexed it. It wouldn’t be totally accurate (depending on how often they index the page), but it would at least give a real representation of when the contents changed. They would have to ignore dynamic things like ads and current-date displays (via pattern matching), but that wouldn’t be that complicated.
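A sketch of that comparison, assuming a crawler that stores one fingerprint per page (the “dynamic content” patterns here are illustrative, not exhaustive):

```python
import hashlib
import re

# Patterns for content that changes on every visit even when the page itself
# hasn't really been updated (illustrative, not exhaustive)
DYNAMIC = [
    re.compile(r"<script.*?</script>", re.S | re.I),   # ads, counters, analytics
    re.compile(r"\b\d{4}-\d{2}-\d{2}\b"),              # "today's date" displays
]

def content_fingerprint(html):
    """Hash the page with the volatile fragments stripped out."""
    for pattern in DYNAMIC:
        html = pattern.sub("", html)
    return hashlib.md5(html.encode("utf-8")).hexdigest()

def changed_since_last_crawl(html, previous_fingerprint):
    # The index stores the fingerprint from the previous visit; a differing
    # hash means the real content changed, whatever Last-Modified claims.
    return content_fingerprint(html) != previous_fingerprint
```

The date of the last fingerprint change is then a far better "last updated" signal than any HTTP header.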

Hopefully it’s just a matter of time…

On the topic of search engines, while researching this entry I came across a few Google features I didn’t know about.

November 17, 2005

SOAP: Gives it a REST

Filed under: General,Technology — groogs @ 11:51 pm

I’ve noticed in the last little while that there seems to be a trend happening with web services: people think SOAP is too complex. I keep coming across articles and comments talking about how web services are just over-engineered. I have to say, I totally agree.

Here are some excerpts from a c|net article:

A debate is raging over whether the number of specifications based on Extensible Markup Language (XML), defining everything from how to add security to where to send data, has mushroomed out of control.

Tim Bray, co-inventor of XML and director of Web technologies at Sun Microsystems, said recently that Web services standards have become “bloated, opaque and insanely complex.”

This isn’t something new. An ONLamp article from 2003 talks about how people use REST over SOAP:

While SOAP gets all the press, there are signs REST is the Web service that people actually use. Since Amazon.com has both SOAP and REST APIs, they’re a great way to measure usage trends. Sure enough, at OSCon, Jeff Barr, Amazon.com’s Web Services Evangelist, revealed that Amazon handles more REST than SOAP requests.

Personally, I’ve always gone with so-called REST interfaces if I have a choice. I’ve in fact been using REST for many years, without realizing it was called REST (the term, which stands for Representational State Transfer, was coined by Roy Fielding in his doctoral dissertation in 2000).

Put simply, SOAP just requires so much setup and overhead to do what should be a simple task. As a programmer, I like to actually make working code, and I hate writing tons of ‘helper’ code that essentially doesn’t do anything. That’s what I feel like I’m doing when working with SOAP, writing WSDL schemas and all the extra junk. There are APIs to make things easier, but in the end it’s still an over-engineered protocol.

REST keeps it simple, with really no formal definition. It’s more a method than anything else. Go to a URL, get a bunch of data back. Talk about simple.

Advocates of REST push the idea that you should use plain HTTP GET requests for retrieving data (as opposed to POSTing a complex XML query like SOAP does). This makes sense, as one of the ideas behind the web to begin with is that any piece of information can be obtained with a URI. This makes REST ridiculously simple, and for this reason some people hate it and others love it. It definitely makes things easy to debug, as you can test responses right in your browser.

Most REST services return a response in XML, but they don’t have to. If all you’re trying to get is one piece of information, it’s just as easy to return that value in the body, with no tags at all. The querying application doesn’t even have to parse anything. Obviously this has its downsides (like not being able to easily return error conditions or multiple values), which is probably the reason responses are usually XML. Of course, it helps that there are XML parsers available for virtually every language, including JavaScript. In fact, REST is basically what drives most AJAX applications.
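To make that tradeoff concrete, here’s a hypothetical client-side sketch (the function names and `<error>` convention are mine, not any particular service’s) of handling both styles of response: a bare value in the body versus a tagged one that can also carry errors:

```python
import xml.etree.ElementTree as ET

def parse_plain(body: str) -> float:
    # Bare-body response: nothing to parse at all,
    # but also no clean way for the service to signal errors
    return float(body)

def parse_tagged(body: str) -> float:
    # XML response: one extra parsing step, but an
    # <error> element (or multiple values) fits naturally
    root = ET.fromstring(body)
    if root.tag == "error":
        raise RuntimeError(root.text)
    return float(root.text)
```

The plain version is trivially simple; the tagged version is barely more work and leaves room for the service to grow.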

Of course, SOAP is XML and therefore human-readable as well, but let’s look at the differences with a small example.

SOAP example

This is an example SOAP request for getting the price of a product.

Request:

POST /SOAP HTTP/1.1
Host: www.ecommercesite.com
Content-Type: text/xml; charset="utf-8"
Content-Length: nnnn
SOAPAction: "Some-URI"

<SOAP-ENV:Envelope
    xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
    SOAP-ENV:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
  <SOAP-ENV:Body>
    <m:GetMarketPrice xmlns:m="Some-URI">
      <symbol>PART304285</symbol>
    </m:GetMarketPrice>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>

Response:

HTTP/1.1 200 OK
Content-Type: text/xml; charset="utf-8"
Content-Length: nnnn

<SOAP-ENV:Envelope
    xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
    SOAP-ENV:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
  <SOAP-ENV:Body>
    <m:GetMarketPriceResponse xmlns:m="Some-URI">
      <Price>50.25</Price>
    </m:GetMarketPriceResponse>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>
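A client has to assemble and later unpack all of that by hand, or lean on a SOAP library to do it. Here’s a rough Python sketch (standard library only, reusing the placeholder "Some-URI" namespace from the example) of just the envelope-building and price-extraction halves:

```python
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

def build_price_request(symbol: str, ns: str = "Some-URI") -> str:
    # Assemble the same envelope as the example above, by hand
    return (
        f'<SOAP-ENV:Envelope xmlns:SOAP-ENV="{SOAP_ENV}" '
        f'SOAP-ENV:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">'
        f'<SOAP-ENV:Body>'
        f'<m:GetMarketPrice xmlns:m="{ns}">'
        f'<symbol>{symbol}</symbol>'
        f'</m:GetMarketPrice>'
        f'</SOAP-ENV:Body>'
        f'</SOAP-ENV:Envelope>'
    )

def extract_price(response_xml: str) -> str:
    # Dig the <Price> element out of the response envelope
    root = ET.fromstring(response_xml)
    body = root.find(f"{{{SOAP_ENV}}}Body")
    return body.find(".//Price").text
```

And that still leaves out the HTTP POST itself, the SOAPAction header, and fault handling.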

REST example

Just a simple GET query:

GET /REST/getprice?symbol=PART304285 HTTP/1.1
Host: www.ecommercesite.com

and the response:

HTTP/1.1 200 OK
Content-Type: text/xml; charset="utf-8"
Content-Length: nnnn

<price>50.25</price>

Considering they both do the same thing, which one looks simpler?
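Calling the REST version needs almost no machinery at all. A sketch (the endpoint is the same hypothetical one as above):

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlencode

def price_url(symbol: str) -> str:
    # Everything the service needs fits in the URL itself
    return ("http://www.ecommercesite.com/REST/getprice?"
            + urlencode({"symbol": symbol}))

def parse_price(body: str) -> float:
    # The whole response is a single <price> element
    return float(ET.fromstring(body).text)

# To actually call the (hypothetical) service:
# from urllib.request import urlopen
# price = parse_price(urlopen(price_url("PART304285")).read().decode())
```

Two tiny functions versus a whole envelope-assembly layer, for the exact same answer.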

Of course, for a REST service to be useful, just like any other API or tool, it needs to be well documented. This means documenting the request parameters, all the things it can do, and all the output formats. SOAP has WSDL, which I guess provides this information (to a point, and assuming you know enough to make sense of it all), but I don’t think that feature alone is worth all the other baggage SOAP carries.

To be honest, I also have only limited experience using SOAP (for the reasons I’ve outlined throughout this post), so I’m quite willing to hear arguments to convince me why SOAP could be more beneficial than REST. At this point, however, I can only conclude that SOAP is just over-engineered. Why bother coding all that extra junk when you can just REST? :)

November 16, 2005

GoogleBar

Filed under: General,Technology — groogs @ 6:55 am

Finally got around to updating some of my Firefox plugins, and noticed a neat new enhancement to the Google toolbar. When you’re typing in the search bar, they added a ‘suggest’ feature (where it finishes words and complete search phrases for you), and even better (since suggest isn’t really new), it shows you the number of results.

(googlebar image)

Even though there is a Google search bar built into Firefox, I still really like the toolbar. Notably, I use the "search this site" button a lot. Even when sites do have a search function, Google often just provides better results. PageRank display is good, and the highlight function (which highlights your search terms on the page) is really nice on long pages. If you haven’t tried it before, I do recommend installing the Google toolbar for a few days. It does take up some screen real estate, but I think it’s worth it.