Tuesday, November 30, 2010

The best investment advice you'll never get

This is a great article from San Francisco magazine online, "The best investment advice you'll never get". If you can spare ten minutes or so and are at all interested in earning steady investment income (and who isn't?) then I highly recommend giving it a read.

Basically, unless you're a gun investor with heaps of time to spend on it, you're not going to beat the index, and you're better off putting your cash into low-cost index funds that mimic the company spread of the major indices. There are some good quotes throughout.
“Invest in nonprofit index funds,” David Swensen says unequivocally. “Your odds of beating the market in an actively managed fund are less than 1 in 100.”

“Buy an index fund. This is the most actionable, most mathematically supported, short-form investment advice ever,” says the Motley Fool.
And there are plenty of good arguments to back up the points made by the various professionals and experts that are quoted. Interestingly enough, the article actually begins by describing what the people at Google did just before the IPO in 2004: knowing that with the IPO a whole bunch of programmers and engineers were going to get their hands on a lot of cash, they thought it prudent to get them some advice.

Rather than letting in the rabid hordes of fund managers and the like that were gathering outside like flies around a trash can, they brought in impartial investment advisors, lecturers and fund managers who weren't there to make a profit out of them.

None of the companies that I have worked for have offered advice like this to me, so I have to find it the hard way, by trawling the internet. It would be great if more people were taught sound investment and monetary practices, but alas I don't see that happening, and besides, if you're not the investment fool then you're the one prying the money from the fool's hands, right?

Monday, November 29, 2010

Vetta RT 77 Cycle computer Follow Up

Since my earlier review of the Vetta RT 77, I've been using it on my commute to and from work. Overall I am pretty happy with the purchase, especially as I got it for the princely sum of $17.

It does, however, have a couple of foibles: the cadence functionality is slightly hidden and the left button doesn't always work properly. You get what you pay for, I guess, so I am not really disappointed with the quality at all, and as I continue to use it the buttons seem to be getting better.

Overall I am still very happy with my purchase, but if you want your computer to be perfect from the get-go I recommend spending more than $20.

Saturday, November 27, 2010

Synology DS211J to Samsung Story 3 Backup

After getting my Disk Station set up as detailed in a couple of earlier posts (part 1 and part 2), it is time to complete the final part of my backup solution, which is to get the drive backing up to an external drive that I will store somewhere else (like in the office). I plan on doing this once a month or so at this stage, but probably more often in the future depending on how much data I am adding and needing to back up.

First impressions are that this thing is sexy, but then again it did win a design award so that is no surprise: a no-fuss, nice solid case with a big rotary switch on the front that makes a solid clunk when turned. Love it! Much better than all the cheap enclosures that I have used in the past. If you want to read a review with performance figures and all that stuff you can check out this one from the guys at StorageReview.

Anyway, it does USB 3, which for me at this stage doesn't mean much as I don't have anything else that supports it. The only question it does lead to is: do I need a funky cable to connect to a USB 2 host? Well, thankfully this page has an easy-to-follow picture covering the backwards compatibility.

I still need the USB 3 cable to fit the socket on the back of the drive, but it is fine to use an old extension cable I had lying around, which is great because it means I can keep the NAS hidden away in the cupboard and still back up easily.

On the Synology side of things it couldn't be much easier, although I did have a slight mental blank and momentarily lose the local backup option. Simply fire up the "Backup and Restore" section of the control panel and select "Create" to create a new backup task. This will fire up the Backup Wizard, which presents options to back up the NAS to three different destinations.
  1. Local Backup: Which backs up to a local USB drive. This is the option that I'm using.
  2. Network Backup: Where you can select anywhere on the network as a destination and it uses rsync or rsh to copy files.
  3. Amazon S3 Backup: Where you can back up to the Amazon storage cloud and choose between encrypted or unencrypted storage, with a trade-off in backup time for the encrypted option.
Once you select the backup type you pick which folders you want and the destination drive for the backup. You can also schedule the backup, which I will not be doing as I am not sure how often I will be connecting the external drive, so I will have to do the backups manually.

After the wizard completes you click go and the backup begins. In terms of backup speeds, obviously we're restricted here by the USB speed. I am writing this while I wait for the backup to complete; currently it has copied 15 GB in around half an hour, giving a copy speed of a couple of minutes per gigabyte. I am not pressed for time or anything so the speed doesn't really concern me. I imagine that once the first full backup has completed I should be able to specify an incremental backup, but I am not sure at the moment.
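For the curious, here's a quick back-of-the-envelope sketch in Python of what that works out to and how long the rest should take. The copied/elapsed figures are from my run above; the total size is just an assumption for the sake of the example, not a measurement.

# Rough throughput estimate for the USB backup above (my figures, not a benchmark)
copied_gb = 15.0          # copied so far
elapsed_min = 30.0        # time taken so far
total_gb = 55.0           # assumed size of the full backup, adjust to suit

rate_mb_s = copied_gb * 1024 / (elapsed_min * 60)
eta_min = (total_gb - copied_gb) / (copied_gb / elapsed_min)

print("%.1f MB/s so far, roughly %.0f minutes to go" % (rate_mb_s, eta_min))
# comes out at about 8.5 MB/s and around 80 minutes remaining at this rate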

Anyway, once again I am quite pleased with the ease of setup and use of this NAS and I have no complaints whatsoever about the external drive. In the future I'll get into setting up external (that is, remote) access to the NAS over the internet and using the torrent client.

How To Be The Dumbest Guy In The Room

How To Be The Dumbest Guy In The Room is a great post that fits into the whole professional/personal development kick that I have been on of late. Continuing on from my earlier post Do unto others as Carnegie would do unto you, the linked post encourages you to be the 'dumbest' guy in the room so that you learn the most.

There are two parts to this really: surround yourself with smart people, and extend yourself to the point where you can fail and potentially look dumb. These are covered pretty well in Greg's post. I can see a third part as well, which is not so much acting dumb as asking a lot of questions. Of course when you ask a lot of questions there is the potential that you appear dumb, but that is a risk worth taking. This is also what Carnegie would have you do (the asking a lot of questions, that is): getting to know the person that you are talking to. While the aims here are slightly different, they use the same means.

As you encourage the other person to talk through lots of potentially dumb-sounding questions, you absorb some of their knowledge about whatever it is you're talking about, and as long as you retain a bit of it you will be smarter for it.

The aim is to be the dumbest guy in the room so that you end up being smarter.

Friday, November 26, 2010

Software Patents... still crap

Today I read on Diary of an x264 Developer that Tandberg had patented, or rather applied for a patent on, some algorithmic techniques that Jason had used in his work on x264 and ffmpeg. While interesting, to me at least, the real point here is that it is the flawed patent system that leads to this sort of application being made.

Without even entering into the idea that software patents are a bad idea (there is plenty in the media about that already), this illustrates some inherent flaws in the system. While Tandberg have obviously acted in a very bad manner in this instance, they have done so because the patent system not only allows such actions but even goes some way to encouraging them. It was only noticed because the developer working in this area had the patent application brought to his attention.

The problem, I feel, really lies in both the prior art and obviousness determinations made by the USPTO and the cost of challenging them. While I do believe that this patent will be stopped, similar patents copied from other people's work have made it through the approval process in the past. The fact that these infringing patents are not picked up comes down to how prior art and obviousness are assessed: ideas that are completely obvious to someone in a particular field, to the point of not being deemed worthy of patenting, may not be at all obvious to someone outside that field.

How then is the USPTO supposed to ensure that obviousness is tested thoroughly? Well, in all honesty they can't possibly ensure that some don't fall through the cracks. This is where the second and more insidious problem comes to the fore. A big company with a scary team of lawyers is basically always going to win against the little guy. If Joe Bloggs from Farmville, Tennessee comes up with an idea that is stolen by a large company, his small pool of funds for fighting a patent battle will be swallowed up in briefs and other such things from the large company, to the point where he can no longer challenge the validity of the patent for fear of not being able to feed his family.

The excessive cost of challenging patents has led to a situation where large companies, like Tandberg, feel comfortable attempting to patent things like this. The risk/reward payoff is worth it for them: they are going to get a slap on the wrist at worst, and at best they have a patent. This patent will then be used primarily as a negotiation tool with other big companies that think Tandberg has stolen some of their technology.

Basically, they come up and say, 'hey, you're using this, which we patented', then Tandberg comes back and says 'well, you're using this, which we've patented', and then they both agree to license their ideas to each other and live happily ever after.

Companies are willing to patent obvious things (touch screen computer, anyone?) so they have a drawerful of patents to wave at their competitors should the need arise, or, when they get desperate, to start suing people in a vague attempt at perpetuating the company's existence. However, the big patent wars never really get anywhere in the long run and just seem to hinder the growth of new and exciting technologies.

So I'd written this whole ramble and then stumbled across a thread on Slashdot which was started by another coder that had the same thing happen to him. There is some pretty good discussion in there as well.

Killing IIS Zombie Processes

We've all been there! Well, all developers and testers at least. You've got a process that is hung and you just can't kill the blasted thing. On *nix it's as easy as getting the Process ID (PID) of the process and running 'kill -9', which is about as close as you're going to get to travelling back in time and killing the process's grandparents back when they lived on a pig farm.

Anyway, in Windows there is nothing available to all and sundry to do this. But thankfully, when I encountered this problem today I used some Google Fu to get to How To Kill Windows Processes That Won't Die, which led me to the good folks at Sysinternals and their PsTools suite.

PsTools is a collection of tools to help with system management. Included within it is PsKill, which by all rights should solve my problem: a pesky IIS worker process that refuses to roll over and play dead. So, following the instructions, I tried to kill the process.

No dice, access is denied. Hmm okay I guess I need to run it as administrator.
D:\Downloads\PsTools>pskill.exe 3256
PsKill v1.13 - Terminates processes on local or remote systems
Copyright (C) 1999-2009  Mark Russinovich
Sysinternals - www.sysinternals.com
Unable to kill process 3256:
Access is denied.
Are you kidding me? WTF? Damn it. This is going to be harder than expected. Off to poke around some more. There are a couple of tricks to try in the comments of the first post I found but none of them seem to work.

Sysinternals Forums? Nope, they only seem to contain Access Denied errors for remote computers. File locked by something? I'll give Unlocker a go. No dice!

Oh man, I feel like an idiot! I thought I had tried 'iisreset' but apparently not, because I just tried it again and it worked.

Son of a... well that was a waste of 15 minutes.

I think PsKill wanted me to kill it as the user that had started the process, which was a user that IIS created, so I didn't have the password. Lesson learned: my Google fu isn't as fu-tastic as it should be. A more specific initial search would've killed the process sooner.

Thursday, November 25, 2010

Election Junk Mail is giving me the shits!

I don't know if you other Victorians are affected like me, but so far in this "Election Campaign" I have received way too much personally addressed propaganda for my liking. The optimal amount, of course, being none. I am quite capable of using the internet and the media saturation to find out everything I need about your policies.

If you're worried about your particular message not getting through perhaps you should consider using your TV and newspaper spots to actually talk about your policy and not just slander the opposition!

In the last month by my count I have received:

  • 2 letters from my local Liberal candidate @clemnewtonbrown
  • 2 letters from my local Labor candidate Tony Lupton
  • 2 letters from the Victorian Electoral Commission politely implying I am too dumb to figure out how to vote.
My fiancée has also received her fair share; for 5 of the 6 letters above we both received the same letter on the same day, with the same pamphlet in it. Such a waste of resources. If you really want to send the things (and you really don't), how about just sending one to the household?

So, let's say that there are 5.5 million Victorians at this point, based on there being 5,473,266 in March. Let's assume that around 80% of them are of voting age, and let's say that each of them receives 5 letters. That is 22 million letters that have been sent.

Each letter I've received had at least one A4 sheet within it and of course the envelope, so let's call that 2 sheets of paper, which means 250 letters per ream of paper. So we are using 88,000 reams of paper. Then, based on the estimate that each ream of paper uses around 6% of a tree, we can estimate that this whole election debacle, in this month alone and purely on direct-mailed propaganda, has consumed 5,280 trees.
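If you want to check my maths, here's the back-of-the-envelope calculation as a quick Python sketch. Every input is one of the rough assumptions above, not an official figure.

# Back-of-the-envelope election junk mail estimate (rough assumptions, not official figures)
victorians = 5_500_000        # roughly 5.5 million people
voting_age_fraction = 0.8     # assume ~80% are of voting age
letters_each = 5              # letters per voter, my guess
sheets_per_letter = 2         # one A4 sheet plus the envelope
sheets_per_ream = 500
trees_per_ream = 0.06         # each ream uses about 6% of a tree

letters = victorians * voting_age_fraction * letters_each
reams = letters * sheets_per_letter / sheets_per_ream
trees = reams * trees_per_ream

print("%d letters, %d reams, about %d trees" % (round(letters), round(reams), round(trees)))
# 22,000,000 letters, 88,000 reams, about 5,280 trees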

What a freaking waste! And this doesn't take into account all the other printed election material that piles up all over the place, the how-to-vote cards that nobody actually uses, etc etc.

Monday, November 22, 2010

Domain.com.au is just Stupid!

Or perhaps it is the real estate agents managing the ads. Either way it is frustrating, annoying and above all stupid.

Ummm, perhaps I should explain what I am talking about.

I am currently, sort of, looking for a place to buy in Melbourne. I am attempting to do the sitting back and waiting to pounce thing, which basically means that I have signed up to alerts from the various real estate sites, including domain.com.au. So today, as is usual for a Monday morning, I am sent an email with "New and updated" properties that meet my criteria. In today's email a good-looking property popped up.

It's in the right price range, looks to be about the right size, and is close to a good shopping area and railway station.

Sweet!

So I click through to the page for the house, and it looks reasonably promising from the look of it. But I notice that there is no sale type or inspection times listed. That's a bit weird; no matter, I'll do my usual things: loading up Google Earth and Street View to check how it looks, calculating the land area, and doing a quick search for anything about the house.

It all looks pretty good, so I inquire when the inspections are on. I actually get a call pretty quickly from the agent.

It was sold yesterday at Auction.

What the fuuuuuuuuuu...............

Why the hell am I getting it highlighted to me as an updated property? It sure looks like the update should be that it has sold! But it would seem that the update is simply removing the sale type and date etc. If I'd gone to the page and seen the auction date as yesterday I wouldn't have bothered looking any further, assuming they just hadn't updated it yet.

Why wouldn't you just delete the whole ad? Or update it to say sold! Which is what they have done on realestateview.com.au.

I notice that Domain isn't the only site that has done it; it is the same on realestate.com.au, that is, no sale type or date shown.

Ahh, I feel better now. I just wish they'd sort the site out (but I will admit that they do have a lot of problems to fix...).

I've made 23 cents

So far with Google AdSense I have made 23 cents.


Looks like my vague hopes of one day quitting full-time work and living off my blog are a long way from fruition. At this rate I will not even get my first payout for another 163 years!




Total estimated earnings: A$0.23

Media Rover Potential iTunes Library Syncing Awesomeness

Thanks to a post on Gizmodo I found out about Media Rover, which is a free program that syncs iTunes libraries from multiple computers with each other via a network shared drive. I won't go in depth into setup information here; you can find plenty of articles on the Media Rover Blog and help on their site.

So now that I've got the Synology Disk Station 211J set up at home, the first thing I did was set up a shared drive so that I could use Media Rover (see Synology Disk Station 211J Setup Part 2 for info on the shared drive setup). First I set it up on my PC: I simply followed the instructions as linked above and everything was cool. I had a couple of errors when syncing, but most of them were for video files and the remaining few were corrupt music files (that I can hopefully recover from the old laptop).

The next step was to install it on my ancient iBook (circa 2002) and see how it went... not so brilliant is probably a good summation. It doesn't have OS X 10.5 (in fact I am not even sure that it'll run Leopard, which is a moot point because I don't even own it!), so it is going to stay stuck on Tiger. Anyway, as I am really considering what to do with it (at the very least a clean install) I decided I might as well copy all my music off it anyway. It was about halfway through this process that I realized I'd probably already done it in the past. So I cancelled the transfer and sure enough everything had already been copied across. Looks like there won't be any syncing to the iBook.

However, perhaps I was trying a little too hard to get that going; after all, I don't really use the laptop that much. I am probably better off getting rid of iTunes on it and just using a simple program to play the music straight off the shared drive on the NAS.

Onwards to my fiancée's oldish Windows laptop, which is absolutely running like a dog at the moment. It is definitely in need of a clean up and reinstall. There is however some music on there that hasn't been synced with the main library, so I decided I would try and get it going and get all the tunes off it.

What an abysmal failure that was!

After spending an hour getting it fired up and Media Rover installed, the iTunes install turned out to be corrupted. So I've reverted to getting the last of the songs off manually (some of the files are corrupted apparently), and I am now getting it ready to be wiped; hopefully I can get that done this weekend. It's going to take a while to transfer stuff off though, because the LAN port doesn't want to work properly so I've got to use wireless.

On to the final computer, the PowerBook, the main reason for wanting to get Media Rover set up. The installation again was quite simple, but when I went to run the first synchronisation it told me:
Could not determine the iTunes version.
Bugger! Closing everything and then reopening Media Rover then presented me with another problem.
Failed - could not mount server
Double bugger! So I reran the wizard to see what that would do. It connected successfully and synced the song that I had in the local iTunes, but then I changed the settings to download all the songs from the NAS and the PowerBook died, the dreaded grey screen of death.

After rebooting I again fired up Media Rover and again it failed to connect to the server. I suspect that it doesn't like using numerical addresses for Samba shares, as it was picking up the share on my desktop fine but would not detect the NAS. I ran the wizard again and managed to get it to copy all of the songs across to the laptop and get everything synchronised.

However, I am not comfortable enough with the stability of Media Rover on the Mac to have it running continuously, or at this stage to use it regularly there. Thankfully I don't really shift music around or get new music that often, so it should be alright. I'll keep an eye out for updates and see if I can provide some useful information to the Media Rover team.

Overall, I think the software has great potential; however, it does need some work, at least on the Mac side of things.

Sunday, November 21, 2010

Synology Disk Station 211J Setup Part 2

As promised in my earlier post, it is time for part two of my Synology DiskStation 211J setup review. Continuing on from where I left off, my next task was to set up shared folders so that, among other things, I could use Media Rover for backing up and syncing iTunes between my desktop and laptop.

I attacked this the same way that I usually attack these things: poking around in the control panel and seeing what jumped out at me. I suppose I should start with the interface, actually. Once you have everything set up with the Synology Assistant, as I mentioned in part 1, the rest of the administration is done via a snappy web interface. It feels like a simplified Linux desktop, which is exactly what it is really, just on a web page. There are four main icons: the file browser, control panel, help, and quick start, which just links you to things in the control panel.

I started by setting up a shared folder: just opened up the control panel, selected Shared Folder and followed the prompts. Then I ran off to try and play with the Media Rover setup. Unfortunately, while it did allow me to set up a shared drive, I couldn't access it properly. I hadn't even given it a thought, but I had forgotten to set up a user and had just set everyone to read only. Then I couldn't access it with the guest account, and it seemed stupid to use the admin account.

So back in the control panel I added a user, again just following the obvious prompts, and tried to assign permissions to the folder I had created. I thought I did it properly, but then I couldn't access it from my PC. At this stage I figured I might as well just make a new shared folder and assign the user properly from the outset, and that worked straight away. So I guess the lesson from me here is to add your users before you try and do anything else. The new folder worked perfectly (although the old one hung around on the Windows PC till it was restarted; I couldn't access it but it was still there) and I was able to set up Media Rover and have it sync across to the NAS without a fuss. You can read more about the Media Rover setup over here... when I finish the post, that is.

My next step was to install and configure the Windows backup tool, Data Replicator 3, on my desktop PC. Installation was a breeze: run it off the disk and in about ten seconds it was all done. Configuration was a matter of selecting a shared folder on the NAS, selecting which files I wanted to back up and clicking go, very straightforward really, at least for the initial backup. With approximately 55 GB of data the initial backup will take around 4 hours or so, well that's what it told me anyway, but it did go a little quicker than its progress bar would have had me believe.

I also wanted to set up the Time Machine backup from the PowerBook. I had to add another shared folder, and then in the Win/Mac/NFS setup, under Mac File Service, it's a simple matter of selecting which shared folder you would like to use, in my case the ingeniously named "TimeMachine" folder. Somewhat annoyingly, selecting a folder requires that you restart the network service on the NAS, which I would have to wait to do as I was still replicating the desktop PC. Once I applied the settings my PowerBook quickly picked up the new "TimeMachine" folder on the drive, and it was a simple matter of clicking start to get it all working.

I grabbed a power meter from Aldi the other day, so I thought I would put it to use monitoring the power draw of the NAS. On power up its power usage peaked at 24 W and then settled back to around 15 W when the drives were spinning but nothing else was happening. When copying an individual large file power usage spiked to 16.5 W, and when completely idle with no drives running the power draw dropped to 7 W.
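Out of curiosity, here's a rough Python sketch of what that sort of draw costs over a year. The wattages are my meter readings above, but the hours-active split and the tariff are pure guesses on my part, so plug in your own numbers.

# Rough running-cost estimate for the NAS (readings from my Aldi power meter above)
idle_watts = 7.0          # drives spun down
active_watts = 15.0       # drives spinning, nothing much happening
hours_per_day_active = 4  # assumed usage pattern, adjust to suit
tariff_per_kwh = 0.20     # assumed tariff in dollars per kWh, check your own bill

daily_kwh = (active_watts * hours_per_day_active
             + idle_watts * (24 - hours_per_day_active)) / 1000
yearly_cost = daily_kwh * 365 * tariff_per_kwh

print("%.2f kWh/day, about $%.0f a year" % (daily_kwh, yearly_cost))
# works out to roughly 0.2 kWh a day and about $15 a year on these guesses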

In terms of transfer speeds I can't be bothered sitting down and calculating heaps of them, although I am sure a lot of you are interested in that. A 1.28 GB file transferred comfortably in around 3 minutes, and a 55 GB backup of my Windows PC via Data Replicator took 2 hours and 45 minutes to complete. Not the most scientific, but I rate it plenty quick enough for me.

I have to rearrange a cupboard to get the system set up in its permanent position, and when it is in there I will get it backed up to the external disk as well, the Story Station.

EDIT: I've now got the external drive connected and backing up, you can read the setup info in my new post.

Friday, November 19, 2010

China stole my packets

I read in Dark Reading today that researchers at McAfee had discovered that:
"At 15:54 GMT on April 8, 2010, McAfee detected a routing announcement from China’s state-controlled telecommunications company, China Telecom, which advertised 15 percent of the world’s Internet routes. For at least the next 18 minutes -- up until China Telecom withdrew the announcement -- a significant portion of the world's Internet traffic was redirected through China to reach its final destination."
Basically, for 18 minutes a great portion of global internet traffic went through China, and what was done with the traffic while it was redirected is anyone's guess. They could easily have sniffed packets, performed man-in-the-middle-style attacks or simply inspected the contents of all the packets and potentially gotten sensitive information.

The basic problem is that essentially all IP protocols and routing protocols are designed for fault tolerance and openness. Security for the most part is a secondary consideration in their design, and it is only in recent years that the focus has shifted to include a strong emphasis on security.

The fact that a malicious (or idiotic) person with the right access can cause the majority of internet traffic to change routes is a massive flaw in BGP and other protocols, but it is also a strength: a few simple commands can redirect traffic around a broken node or link in a matter of seconds.
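To illustrate why bogus announcements are so effective, here's a toy Python sketch of longest-prefix matching, which is how routers pick between overlapping routes. It is purely illustrative (made-up addresses, and not the actual mechanics of the April incident): a more specific route, wherever it came from, wins.

import ipaddress

# Toy routing table: prefix -> where traffic gets sent (illustrative addresses only)
routes = {
    ipaddress.ip_network("203.0.113.0/24"): "legitimate provider",
    ipaddress.ip_network("203.0.113.0/25"): "hijacker",   # bogus, more specific announcement
}

def next_hop(address):
    # Longest-prefix match: the most specific route containing the address wins
    addr = ipaddress.ip_address(address)
    matching = [net for net in routes if addr in net]
    return routes[max(matching, key=lambda net: net.prefixlen)]

print(next_hop("203.0.113.10"))   # -> hijacker, because the /25 beats the /24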

Moving forward the balance between fault tolerance, reliability and security will be one of the biggest challenges facing network engineers working at the elite level of global backbone routing. I can only see such attacks continuing to grow in severity and frequency over the coming years.

Oh man, apparently I am Blue Collar...

What a blow to the psyche!

I read Delimiter – Australia's blue collar ICT challenge the other day, but I guess the total effect didn't really sink in. While I aspire to the white collar roles and the extra pay etc. that are associated with them, the reality is that I am firmly entrenched in the blue collar column.

Bummer!

I guess, like most people, I tended to think of the stereotypical unkempt engineer sitting in a darkened room crunching code, a couple of degrees on the wall behind him, but the reality is that most ICT employees out there are in fact doing things like support or installation, all things that are more blue collar type roles, even if not the traditional blue collar factory role that people think of.

Do unto others as Carnegie would do unto you

I've finished "How to win friends and influence people" now and, as far as I can tell, the basic premise is this:
Do unto others as you would have others do unto you.
Yup, that's right, the core of Carnegie's book is that by pleasing someone, by praising them and complimenting them, you can get them to agree with you, to do whatever you want, to be a friend for life.

The book goes through a series of "Rules of Engagement" in several separate parts and relates how these rules have worked for various people and how to apply them to everyday life. While some of them seem contrived, the book itself is quite easy to read and shouldn't take too long to breeze through. The book is broken up into six parts.

The first part, Fundamental Techniques of Handling People, is summed up as follows: be nice, never criticize, and make other people want the same things that you want via sincere and honest appreciation.

Part Two, Six ways to make people like you, gives you a few ways to win people over to your side, to make them your friends. Basically, five of the six ways can be stated simply as: be a good listener and focus on what the other person wants; the remaining principle is to smile. But the overarching theme is that when you do all of this you must do it sincerely; you have to make yourself want to be interested in the other person.

Part Three, How to Win People to Your Way of Thinking, is focussed on debate and reasoning, more specifically how to get people to agree with you. Carnegie here gives twelve rules for winning arguments and people over. The first is don't get into an argument to begin with, and this is a great point: once you have descended into an argument you have already lost. The second, which ties into a couple of the other points, is to respect the other person's opinion and never, ever tell someone they are wrong; they'll end up getting pig-headed about the whole thing. This ties into being friendly, letting the other person do most of the talking, being sympathetic to their ideas and desires, and viewing the world from their point of view. The remaining principles focus on getting the person to think that your idea is their idea (that way they want to win the argument), challenging them, and appealing to nobler motives. The final and perhaps most important principle is to admit when you are wrong.

Nice ways to change people without giving offense or arousing resentment is the next part of the book and focuses on changing other people's behavior and attitudes. Carnegie again supplies some rules, and these can be summed up as follows: always begin with praise and keep giving it, never single out a person's mistakes except your own, and ask questions instead of giving orders so that they want to do what you want. The other interesting idea is that by holding the other person in greater esteem they'll strive to maintain that reputation.

From here the book trails off a little. Part Five is a series of letters from fans of Carnegie's lectures, which the book is based upon, and the final part focuses on making your home life happier. Basically you need to do the same thing at home as you do in the rest of the world, that is, use the suggestions from the first five parts of the book, oh, and you need to be good in bed (yup, you read that correctly). To me this final section is the only one that really dates the book (it was originally published in 1936 after all); suggestions like a 'wife' should keep the home interesting and attractive and vary the menu so dinner time is interesting will probably raise the ire of some people these days, but don't let that detract from what is overall still a good book on human nature and, I would say, a must read for someone that wants to do well in the world of business and life in general.


I would definitely recommend reading this to anyone that has difficulties dealing with coworkers or trouble getting people to come around to their way of thinking.

If you have any suggestions for other books in a similar vein, please post a comment here or on my earlier post, On a voyage of self discovery, which lists a few books I'm going to read when I get around to it.

If you want to help me earn my first ever Amazon associate money then you can use the link on this post. Alternatively, the book is in pretty much every secondhand book shop worth its salt. I had a request for an audio book link; I didn't have much luck on Amazon, but you can try yours with this: Search Amazon.com for How to win friends and influence people audio book.

Thursday, November 18, 2010

Vetta RT 77 Cycle computer mini-review

Ever since my old cycle computer fell off on my commute home in the middle of winter, I've been on the lookout for another cheap cycle computer to replace it. This time I wanted cadence as well, as apparently this is an area that newbie riders should focus on a bit.


I bided my time waiting for a computer to come along in my price range, but unfortunately no free ones presented themselves, so I eventually settled on a Vetta RT 77 which I picked up from Torpedo7.com for the princely sum of $17, and it is still available for that price.

It is basic but has the following features:
  1. Current speed
  2. Average speed
  3. Maximum speed
  4. Speed comparator
  5. Cumulative odometer
  6. Trip odometer
  7. 12/24 hour clock
  8. Ride timer
  9. Service timer
  10. Cadence
  11. Average Cadence
  12. Maximum Cadence
  13. Freeze frame memory
  14. Auto start/stop

Which is pretty much all that a person needs. Mounting it was easy, just a matter of liberally applying the cable ties that came with it to fasten everything in the appropriate places and keep the cables nice and tidy. The only annoyance is that the cadence lead was quite a bit longer than required, so I've had to curl it up and tie it to the down tube, but other than that it's pretty good. The manual is a little confusing, but really, how much do you need it for installation if you know how these computers work: a simple sensor mounted on the frame/fork that is triggered by a magnet mounted to the spokes/pedal.
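If you're wondering what the computer actually does with those magnet pulses, the arithmetic is dead simple. Here's a rough Python sketch of the sums; the wheel circumference is an assumption for a typical road wheel, not a Vetta spec, and the pulse timings are made-up examples.

# Rough sketch of the sums a cycle computer does with its magnet pulses
WHEEL_CIRCUMFERENCE_M = 2.1   # assumed: roughly a 700c road wheel, measure your own

def speed_kmh(seconds_per_wheel_rev):
    # one pulse per wheel revolution from the fork-mounted sensor
    return WHEEL_CIRCUMFERENCE_M / seconds_per_wheel_rev * 3.6

def cadence_rpm(seconds_per_crank_rev):
    # one pulse per crank revolution from the cadence sensor
    return 60.0 / seconds_per_crank_rev

print(round(speed_kmh(0.25), 1))   # wheel turning every 0.25 s -> about 30.2 km/h
print(round(cadence_rpm(0.67)))    # pedalling every 0.67 s -> about 90 rpm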
I've been using it for a couple of weeks now and it's pretty good, although I must have missed the average cadence feature; I'll have to have another play with it when I get home to see where it is. Up until now I've just been leaving it on the cadence display that shows current cadence and current speed. This is great for me as I am working on increasing my cadence.
Also, this seems to have a better clip arrangement than my last computer so it shouldn't fall off as easily.
Overall I am very happy with my purchase, and hopefully this one won't fall off in the dark where I can't find it.

Atlassian's Performance Review Experiment: Employee Rewards and Training Part 1

If you haven't had a chance, or don't follow the Atlassian blogs like I do, head on over to Atlassian's Big Performance Review Experiment and have a read, and while you're in a reading mood check out my take on Google's 5 Engineering Management Mistakes.

I find Atlassian as a company growing on me day by day; if I still lived in Sydney no doubt I would have applied for a role (or roles) there by now. They are obviously putting a lot of thought and effort into employee development and applying the same brilliant thinking that led them to become one of Australia's software success stories. For anyone that suffers through the usual crap that surrounds employee training and motivation this sort of feedback structure is totally unheard of: maybe you get a yearly review, or you only get feedback when you cock up something really, really badly, or maybe you don't even get a review. Your training probably consists of the occasional certification course and any reading you happen to do on the way to solving a problem, and you certainly don't get personal development training. I'd love to have these sorts of structures in place and adhered to in all the companies I work for.

I was rather fortunate in that I had a manager at my old job who wanted to try and develop systems like this, and he battled hard against the red tape and time pressures to try and achieve it. I was also fortunate that, above all else, he was more interested in proper training and development that served MY interests ahead of the company's. While his secondary goal was always to help the company, his primary focus was to turn his staff into well rounded engineers that would be well equipped to move forward in their careers, no matter where they ended up using the skills.

However, even in this environment, perhaps because it wasn't embraced fully by the company, there were problems. The one that I had personally was that I changed projects to a very complex one, the sort that has a learning curve shaped like a step. Basically, even though I was familiar with how to use and even debug the product quite well, when I moved into a development role I was thrown in the deep end. Perhaps if I was in the main offices there would've been better training support, but as I was in a satellite office I relied on a few people in my team to teach me my way around the code base, the development tools that surrounded it (all hand crafted for the 30 year old code base) and all the other paraphernalia associated with developing on a mission critical product that had been around in various forms for a long, long time.

Now I am not for a minute saying that these guys weren't supportive; it's just, well, that they didn't know what I needed to know. They'd all been working on the product for a while and had stopped noticing all the simple gotchas that they now avoided without thinking. Eventually I muddled my way through and became a useful member of the team again, but it took me a couple of months to get up to speed. During that time I will admit that at various points I lost my desire to come into work, I lost the motivation to get my head around the whole thing a few times, and I found myself wading through the detritus of the internet instead of focusing on learning and building my knowledge.

It got to the point where I'd been hassling the other team members so much I didn't feel like I could interrupt them any more. After all, they had their own work to do and I was holding them back. At this point I felt alone. I tried to express my frustrations to my manager but couldn't quite get them across as well as I wanted to. I'd tell people it was really hard and they'd agree with me, say that it was difficult, and tell me to push through.

Eventually I did push through and worked well (albeit still slower than other team members) for a couple of months before performance reviews came around. Now, based on all my feedback I thought I'd been doing alright; sure, it'd taken a while for me to pick up the really complicated project that I now worked on, but overall I felt pretty good. I didn't think I'd nailed it like previous years, but I hadn't failed miserably either.

Well, at least that's what I thought. Unfortunately the management team obviously thought I sucked: I got the worst review I've ever received, missed out on a bonus, and overall I was mighty pissed off. The main reason: I didn't know I was at the bottom of my peers. For a year I slaved away, but no one ever took me aside and told me to pick up my game.

The problem was that in the end the performance metrics used were focussed on things like features implemented, bugs fixed, etc. There was no recognition of the countless hours spent trawling the intranet looking for answers to my problems, no recognition of the mountain of technical documents I had to read and understand (like a design document written before I was born!).

It seems to me that this sort of problem wouldn't happen at Atlassian, or at the very least would get picked up a whole lot sooner, and that with lots of 1:1s and a focus on staff development they would be able to work through it to ensure that the employee didn't come out of the whole process feeling ripped off. The whole thing still annoys me several years later, and my guess is that it will continue to annoy me.

Wednesday, November 17, 2010

Stuxnet is like an Onion

On the outside it appears ogreish in its impact on systems and devices, but careful research reveals that it is a devilishly beautiful piece of extremely targeted malware, designed, it would seem, to cause uranium enrichment centrifuges in specific Iranian nuclear plants to break down.

Tuesday, November 16, 2010

Synology Disk Station 211J Setup Part 1

So, as you may have seen in my earlier post, I've finally gotten around to getting a backup solution in place for my house. Now, I don't have loads and loads of crap that I have downloaded, nor do I generate a tremendous amount of data at home. However, I do dabble with a bit of photography, I have a music collection that has grown over the years and is now spread across 3 laptops and a desktop, and I am starting to gather, on drives spread around the place, those important documents that you really don't want to lose.

So, let's get down to it. Here is what my solution will consist of:

Saturday, November 13, 2010

Finally getting a back up solution!

I got some new toys today from my friends at Scorptec. These are going to form the proper backup solution that I've never had before. I got an awesome deal, with them setting up a promo special because I asked nicely on the Overclockers Australia Forums (if you're a member you can check out the special in this post). Here is a list of what I've got:


Hopefully I'll get the full review up in the next couple of days.

EDIT: The full review starts in part 1 and continues in part 2 and then in this post about backing up to an external USB drive.

Friday, November 12, 2010

5 Google Engineering Management Mistakes: Employee Rewards and Training Part 1

With all the kerfuffle around the interwebz of late about Googlers running away to Facebook, and more recently the 10% pay rise story, I found this presentation, 5 Google Engineering Management Mistakes, informative and timely.

I was actually going to write about it before the 10% pay rise story broke but didn't get around to it; actually, the pay rise bit just makes the whole presentation more interesting. Basically it is about how Google trains and rewards its tech leads/managers and the reward structure that they use across the company.

Things like 20% time and free soda etc. have long been talked about as good reasons to work at Google, but this presentation actually focussed more on the negative issues surrounding employee rewards. Employee rewards and remuneration are a tricky area: simultaneously they are among the biggest costs a business faces and the most powerful tool for capturing and retaining quality staff. (If you pay peanuts you get monkeys.)


I won't go through the whole presentation here, but I did want to highlight a few pertinent points.

  1. Managers need good training.
  2. Team Leaders should meet with staff 1:1 every couple of weeks.
  3. Team Leaders were rewarded more for individual work than for getting the team to do good work; there was not enough recognition of managing the team well as good work.
  4. Recognition is at least as important as monetary rewards.
  5. The promotions system could be gamed, and a lack of promotion led some people to leave.
Managerial training had never really entered my mind until my old boss got to the personal development part of the MBA he was studying part time. Suddenly our coaching sessions became more frequent and more meaningful; suddenly I was getting asked hard questions about career direction and training and a whole bunch of stuff that I had never thought about as a recent graduate. It was annoying sometimes, but it was also very good. This flows straight into the second point above about team leader meetings: a good manager can't coach you and provide good feedback if you never meet with them.

I've got some more stuff I want to post about training and staff incentives that I'll do in a later post (that I am yet to write); however, I thought that the Google presentation was a good introduction. I've also got some thoughts about a disappointing phase that I went through at work that is directly related to training and reward, and how I ended up feeling very short-changed.

'Which' Bank's latest thinly veiled attempt at improving customer satisfaction

Following the interest rate rise above the RBA's hike last week, today my fiancée received an email from the Commonwealth Bank.
Hi Toby's beautiful fiancée,
It’s CommBank employee from the commonwealth bank, we spoke a couple month ago regarding home loans. I hope this email finds you well. I’m just sending you a courtesy email to see if everything is ok and if you have any home loan related question that you would like to ask me. Keen to hear back from you. Cheers.
Maybe it's just the cynic in me, but they didn't care what we thought before they got buried under a steaming heap of bad press. Perhaps they have changed their ways and we'll get another follow-up in a couple of months to 'see if everything is ok', but I seriously doubt it!

Simple Windows Load Testing

Today a coworker asked me to recommend a simple tool for load testing a specific URL. Now, there are plenty of tools out there, but most are written for Linux or are overkill for the simple task of spamming one URL to test performance. In MSDN land the tool most mentioned seems to be WCAT, which is a very powerful tool, but it's like using a nuke to open a walnut; an example of one of the possible configurations is shown below.



As you can see, it can be used to control a bank of client machines from a single controller. And to use it you 'simply' specify a bunch of parameters (id, url, verb, redirverb, statuscode, redirect, cookies, secure, handshake, protocol, algorithms, version, postdata, close, authentication, username, password, server, port, handler, to name a few) in a script file. Easy, right?



To be honest I looked at the documentation and panicked slightly. Surely there is a better way, I thought, and thankfully so did someone else, someone with a bit more time on their hands than me: they came up with a Fiddler extension which makes it dead easy to use. (If you do any web development stuff, or just want to check out what HTTP traffic is going to and from your computer, you should check out Fiddler; it's Wireshark for HTTP.)

Install everything required, then open Fiddler and query the site you want to test, highlight the sessions you want to use in the load test, and select the run WCAT option. That'll use the default settings shown below, which for the most part will be pretty good. But these can be changed easily, as shown on the blog post about the tool.



You can then view the results in your web browser, which gives simple stats like requests per second.
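For what it's worth, if you just want to hammer a single URL and don't fancy installing anything, a dozen lines of Python will do a crude version of the same job. This is only a sketch and has nothing to do with WCAT or the Fiddler extension; the URL and the numbers are placeholders to swap for your own.

import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://localhost/"   # placeholder: the URL you want to spam
REQUESTS = 200              # total requests to fire
WORKERS = 10                # concurrent clients

def hit(_):
    # Fire one GET and read the response so the whole request is counted
    with urllib.request.urlopen(URL) as response:
        response.read()

start = time.time()
with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    list(pool.map(hit, range(REQUESTS)))
elapsed = time.time() - start

print("%d requests in %.1f s -> %.1f requests/second" % (REQUESTS, elapsed, REQUESTS / elapsed))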

Hope you find the tool useful!

Monday, November 1, 2010

Smart meters need smart stats

Apparently my street is about to get updated to having Smart Meters, at least according to the sales guy that knocked on my door the other day.

Now, depending on where you look they are either our environmental savior or a sadistic cash grab out to financially destroy the poor and huddled masses. The benefits for the power companies are obvious:

  1. No need to send people around to every house to look at the meters
  2. Near real time feedback on network demand allowing tighter control of supply
  3. Ability to remotely connect and disconnect power to houses
  4. New tariffs are much higher than existing power costs
However, the benefits to the consumer and the environment appear to be lacking. The potential benefits are obvious and centre around real-time feedback on energy use. Quite frankly, it is hard to determine how much energy is being used in your house at any particular moment in time. Even after studying electrical engineering at uni, I couldn't tell you if having my induction cooktop on high uses more power than my air conditioner, for example.

If consumers are empowered by real-time stats, if they can see the difference that adjusting the thermostat by a degree can make to their energy usage, then that is where the real benefit of smart meters will be had. That is, basically:

Smart meters need smart stats

But the power companies actually lose out if they provide those stats to the customer; think about it for a minute. Not only are they forced to pay for the infrastructure required to collect and display the information, but they also lose out because an informed and educated customer will use less electricity. It is counterintuitive for them to provide the information, so it is obvious that a third party provider needs to exist to collect and process it.

I am yet to see any indications that such an arrangement will ever happen, but I am sure that we'll get there in the end. However, I was greatly pleased and encouraged the other day when I found Smart Energy Groups in my daily twitter wanderings. It is a Melbourne-based group that has just started to sell the Seg Meter.
An internet enabled multi-channel energy meter that measures energy usage, and connects to the smartenergygroups.com website.
Basically it does everything that a smart meter should: it allows the user to get instantaneous energy usage feedback on a range of devices (computer, smartphone etc.) and even has channels so that you can set it up to tell you what individual circuits in your house are using.

The device is designed to be modded and improved by its users, which will no doubt lead to some awesome customizations and additions from the hacking community. While it is too early for this device to be widely used by Joe Public, it is a step in the right direction, and I am seriously considering how to get one of these working in my home. If it wasn't for the possibility of a move shortly I probably would have already ordered it.

In the end, unless the power companies get an advantage from providing the information to consumers there will be no incentive for them to do it; why would they, when they get all their benefits upfront and such information will just harm their bottom line?