Sunday, January 05, 2014

Second Thoughts on Having a Personal NAS

A year ago I finally took the plunge and joined Amazon Prime. What a happy prison it is. Good discounts, fast shipping, and lots of incentives to buy an Amazon Kindle tablet. But that's not really what I wanted to write about. It's fallout from being in the happy prison that has caused me to question whether my approach to having a personal NAS is a good idea.

So here's what's happening: I'm now buying a lot of ebooks from Amazon. I've got a Nook HD+ tablet, so I also buy them from Barnes and Noble. I've also found my way onto some nice free ebook mailing lists. And I have digital magazines on Zinio. And comics on Comixology. And more ebooks on Steam, and some on Humble Bundle, and some more from Groupees, and still more on BundleHunt. I also have a few loose ebooks on my local drive, managed by Calibre.

Do you begin to see the problem? In a world where technology is supposed to make life easier, I now have several more accounts and passwords to remember, and the sad truth is I'm probably not going to read even half of those ebooks, and that's being optimistic.

"So wait," you ask, "isn't this exactly why you got the NAS? To put all that content in one place and be able to access it from any device?" Well, sort of. The effort involved in transformation of that data from the commercial cloud to my personal cloud is sort of a pain in the ass. It's more effort than memorizing ten passwords.

When I use Amazon's cloud service to store my MP3s, or Microsoft's SkyDrive or Dropbox for commercially provided network storage, it's really convenient. Security, infrastructure, capacity and maintenance are all someone else's problem. I do get the point of the personal NAS: I have full control of my content, and if Amazon goes out of business (unlikely) or Microsoft decides to pull the plug on SkyDrive or change it into something else (less unlikely), then my content is still safe on my own hardware. Not to mention that if any of the data is sensitive, such as client information, it's better on my own device than on someone else's.

But for non-sensitive materials, I'm not sure having a personal NAS is really that big a deal. I love the Synology Diskstation I have, but it wasn't free. And it's not free to maintain, although as you've learned from my last several entries, harnessing additional functionality was really cool.

I think what I need is for someone to write a consolidation app that pulls all of this together. In the meantime, I've got a Frankenstein of a storage approach. And you know what? Even with all their problems, the happy prisons that Amazon and Steam give me for all those books, music, and games are awfully comfortable and I'm glad to have them.

The Best Bonus I Ever Got

I know I complain a lot on the blog about IT management. Well, in my opinion, IT management asks for it. Just like lawyers do when they send our society on a downward spiral to hell on riptides of lies and blame deflection.

But this post will be different. I promise. Today, I'll talk about the extra bits of cash compensation employees get outside of their base salaries. These have been few and far between in my career, so maybe this will also get to be a blessedly short post. Apologies again in advance for some salty language that might follow.

The first bonus I got didn't come until about five years into my career. A lot of that had to do with the crappy company I worked for, but a lot also had to do with me being an inexperienced and poorly managed resource. Anyway, it was a day cruise given to my team for working hard. I appreciated the gesture, but it was on the lame side as rewards go. And I didn't like how only half the team got to go; in retrospect I consider that a managerial mistake. Some of the newer team members weren't included (in what I would bet was a cost-cutting move). Leaving people out is not a smart way to handle team morale in an effort that was supposed to be about raising team morale. But it's the thought that counts, so I count it as a bonus even though it sucked in more ways than one. Shit, sorry, I was supposed to be positive in this post. I'll try harder on the next paragraph.

The next bonus was much better. It came in my sixth year with my first company. I had moved to a new, smaller team and I was doing a much better job of being useful as I'd become more experienced. I also had a more laid-back supervisor and a pretty reasonable manager. My team received an end-of-year bonus of about three thousand dollars. Not enough to buy an island and retire, but nothing to sneeze at either. What is so damn goofy is that I worked less hard for that bonus than I did for the day cruise.

I switched to contracting for a while, and bonuses are typically not part of the compensation structure for hourly employees, so there's nothing to report until I switched back to full-time work around 1999. Then I got a variety of bonuses. An annual performance bonus could be anywhere from two thousand to five thousand dollars. A spot performance bonus I got was three hundred dollars.

I bounced back and forth between full-time and contracting for a while after that but didn't get another bonus until I was again full time and had a manager that appreciated my work. I killed myself with more than a year straight of overtime and got a spot bonus of a thousand dollars.

I think it's fair at this point to note some lessons I've learned about bonuses. Your experience may be different. In fact, I hope it is. I hope you've done significantly better.
  • Bonuses are usually but not necessarily tied to company profitability
  • Bonuses are highly dependent on your immediate superiors and their superiors
  • Bonuses are a very subjective thing.
    • At one company I got almost no bonuses until the end, and I was working less hard than I did in the earlier years. Some employees told me of bonuses they got for putting in a mere hour of overtime. Now that's the kind of consistency that earns employee trust!
    • At another company, bonuses sometimes came with formal recognition in the form of "President's Awards" or "Outstanding Performance Awards". These were REALLY ridiculous. It's not that some of the people getting them didn't deserve them. The problem was that the significance of the achievements earning these awards was all over the map. Some people got them for working hard on a specific important project, even though the teams on that project might have had several deserving people. Or two people might get awards for working on different projects, even though one was a multi-month or multi-year effort and the other was a one-week commitment. It all came down to who had a manager that liked them, and in the end, I think this hurts morale more than it helps. Getting no recognition really hurts when you give your heart and soul for a long time and when you really make a difference. I'm not sure what the answer is for this bullshit though, because for the people that deserve it, it is nice to see them get something.
  • Don't depend on bonuses. They're not guaranteed. Hold their feet to the fire in salary negotiations. If you get a bonus, great, but either way you will get the salary.
  • IT shops are pretty barren when it comes to bonuses, especially when the company treats IT like a cost center. For sales and a few other divisions, bonuses may be a more legitimate part of the compensation structure.
  • If you want to work in IT and get bonuses, find IT shops in companies where an annual bonus is universal to the pay structure. For example, one of my clients was a trading firm, and everyone, even IT, got significant bonuses (like 20-40% of the salary, a concept that is completely alien to me!).
So which of the bonuses above was the best one? I am thankful for them all, but the answer is, "none". The best one didn't come from management; it came from my users. One of my clients had a legacy system that had (and still has) a terrible user interface. They were suffering greatly from having to enter data one row at a time, spending multiple man-days of effort each month. I added a simple import capability so they could massage their data in Excel and then import it by cutting and pasting. Did it work? A few weeks after the feature went live, I got this from them:

That's right: a modest $25 gift card, for a place that makes food that's mostly not on my diet. It's the best bonus I ever got. Why? Because as Jeff Atwood would say, it showed that people were using my software. It showed that my work improved lives. What makes this bonus great is not even the $25, but the kind comments from my users on the card it came with. 

Now I'm sure there are managers looking at this and saying, "Gee what an asshole that Bernard is. How could that meaningless shit be worth more to him than a thousand dollars?" Man, if you're a manager saying that right now, I pity you. You have completely missed the boat on how to do your job and how to be a leader. And I pity even more your subordinates.

Oh shit, I'm supposed to be positive! Ah, ok, well, I took the card and had a nice date with my wife, eating wings before a movie.

And for any overly literal pinheads reading this, no, this doesn't mean I don't appreciate monetary bonuses. But this kind of recognition is truly special to software developers, in the same way that a compliment to a chef or an artist means as much emotionally as the money. The chef gets a paycheck either way, but if he knows his clients were enriched by his cooking, he has a sense of purpose fulfilled. And this is where IT management really needs to get a better understanding of how technical people respond to feedback.

We really don't give a shit if you praise us for good attendance or being on time to meetings. We do like pizza, but throwing a pizza party isn't really doing much for morale. When you use metrics like how many SOX audits we passed or how little we were penalized for dress code violations, you're just drawing attention to the parts of the job that suck. 


Wednesday, December 04, 2013

Craftsmanship is Dead

Sorry in advance for the negativity but I just encountered something that has really pissed me off. Also, apologies in advance for salty language, but what I'm about to show you doesn't deserve professionalism. Again: WARNING, FOUL LANGUAGE FOLLOWS.

I complain about software developers who treat their job as "easy", mostly because they're lazy bums that don't want to do a complete job. But developers aren't the only half-assing bastards out there.

I'm still living in my first home and I am regularly appalled at the shortcuts I've seen the builders take in making this house. There are places where the most fundamental of building constructs, the 90 DEGREE RIGHT ANGLE, was built incorrectly. Good grief, how stupid do you have to be to mess up a right angle?

When I look in the attic, it's just a head-smacking collection of hacks and patches. And I paid money for this?

Yesterday I replaced a garbage disposal unit that went bad after 14 years. The Whirlaway 191, a 1/3 horsepower unit, is not what pissed me off. It was dirty, but 14 years in an age when everything is disposable is not bad, and really it was just one of the blades that went bad; the rest of the unit probably could have soldiered on a couple more years.

But the part that got me was when I went to remove the existing power cable so I could move it over to the replacement Whirlaway 291. The power cable's ground wire had not been fastened to the ground screw on the old unit. Wow. Really? The guy that installed this was that lazy? He just cut the ground wire so it wouldn't be in the way, connected the other two wires, and plugged it in. Not a care in the world about any possible power surge or electrocution. Fucking asshole!

And it gets better! When I took a look at the plug to identify the larger prong, so I could trace it back to the exposed wire and connect it to the proper wire PER INSTRUCTIONS, this is what I found:

Yep. Take a close look at those prongs. Just look at it. Yes, that's right, the lazy fucker filed them down so he wouldn't have to worry about whether he had the right wires connected. FUCK ME. I thought this kind of bullshit was supposed to exist only in the realm of Tim Allen jokes.

Yeah, I know, there are dozens of professional electricians out there that will say, "Oh, you're making too big a deal of it. This is low voltage bullshit that won't hurt anyone."

Except...THIS ISN'T ABOUT VOLTAGE. This is about doing a COMPLETE, THOROUGH and CORRECT job! This is about the simple task of following instructions handed down by the professionals that made the garbage disposal unit. Was it really that hard that the contractors couldn't do it right? Would it really have taken that much longer to do it to specifications? To be, oh, I don't know...SAFE?

Here's the kicker...the guy that did this is not just lazy, he's stupid too. Because he didn't file down the ground prong, filing down the other prongs doesn't make a difference...the ground prong forces you to put the plug into the socket correctly. So I'm left to think this guy filed down the prongs out of habit.

It turns out the wires were connected correctly, but the evidence points to this being a stroke of luck rather than the product of professionalism and preparedness.

Holy shit. I paid for this shit. This entire thing is just fucking embarrassing. It brings me such comfort to know this house was at least partially built by zombies.

You know, in the past I've withheld names to protect the not-so-innocent. But I can't take it anymore. The only thing that will make me feel better is the truth. I bought this house from PULTE, owned by BILL PULTE, a so-called MASTER BUILDER. The fact is that Bill Pulte never touched my house, it was one of the thousands of contractors he's got working for him that he's never even met (decision-consequence gap, bitches!). So there you go. When you're looking at houses, remember what I told you. But I wouldn't be surprised if other builders were cutting corners too.

CRAFTSMANSHIP IS DEAD.

Saturday, September 21, 2013

The Linux Adventure Part 5: Everything is Broken

Ye gods.

You know all the stuff I wrote in the last five blog posts or so about the NAS unit? Well, it turns out that while it worked, side effects broke some of the unit's functionality. First, the unit's network light would flicker, and then it would cease to go to sleep, a critical function for energy savings on a 24/7 appliance. Then I also noticed the Diskstation would not shut down or restart when given the manual command to do so in the software.

After much gnashing of teeth and many face palms, I learned that bootstrapping the unit caused the issues. Synology generally is quite liberal about bootstrapping, even including information on how to do it in its official support resources. However, the operating system software, Disk Station Manager (DSM), gets updated regularly. Usually, this is a good sign that shows a company hasn't abandoned its product and is actively supporting and improving it. However, it also means that the environment running the station is a moving target. And by also allowing bootstrapping, Synology has led me into the very bear traps I see corporate IT shops sticking their balls into every day.

Young Profession, Old Debate

In the corporate IT world, there's an established debate about "build vs buy". Do you build the software you need from scratch or do you buy an off-the-shelf package? The debate usually has these points:

Build
  • PRO: Software is customized to your needs and way of doing business
  • PRO: You have complete control of the code
  • PRO and CON: You have complete responsibility for the code
  • PRO: Proprietary business knowledge is institutionalized in the code
  • PRO: Enterprise processes are enforced by the software globally (assuming proper implementation)
  • CON: In-house development is expensive
  • CON: In-house development often requires non-software shops to have proficiency in software development (bigger con: in reality most IT shops have at best mediocre competency).
  • PRO: Properly implemented, a custom software team can more rapidly change the software than a major vendor can or will
Buy
  • PRO: Buying someone else's software is less expensive because you don't have to have in-house development resources and licensing is cheaper than development. That's the theory anyway...in reality I'm not convinced it's less expensive, but I agree that writing a check monthly is probably less work than having to account for a staff of developers and all the extra HR overhead.
  • PRO: Buying means you just use the software and the vendor handles development and support. Those of you that know better can laugh now.
  • CON: Your business now does business the way the vendor's software wants you to, not necessarily the way your business people want to.
  • CON: There may be limited capacity for customization of the vendor software to implement proprietary strategic processes
  • CON: Reality shows that for core business (not commodity items like word processing or spreadsheets) buying is still expensive and implementation missteps often negate any cost advantage
  • PRO: Certain industry standard processes, if implemented well, can be a part of the vendor software
  • CON: It is rare that all companies do business exactly the same way
  • PRO: Vendors may be able to capitalize on integration with common third party packages for things like accounting
  • CON: You may think that a vendor is more dedicated to the principles of good software design and best practices and therefore will deliver stable, efficient and intuitive high-quality software just like the kind you find on Apple computers. The truth is that some shops may be very good, but most are made up of the same knuckleheads that soured you on the "build" approach.

We Want to Have our Cake and Eat it Too

In the 80's and 90's it was common for companies to take a "build" approach. Software was new and exciting and there weren't many vendor packages available, so the typical IT guy said, "Oh, it's easy, all we have to do is..." and much spaghetti code was born.

Later companies started realizing what a mess it was to maintain crappy systems. Prodded along by drinks on the golf course and rides on corporate yachts, they decided to buy instead. But they made a critical mistake. They still wanted to do business "their" way and that required changing the software they bought. They wanted the best of both worlds but ended up with the worst of both worlds by buying and then heavily customizing.  Oops.

Now they had vendors that wouldn't or couldn't respond quickly to changing the software to fix or add desired functionality. Their in-house customizations did address some of the core software's gaps, but, built by inexperienced developers, they proved cantankerous and bug-ridden, and the users hated using them. And there was another issue: when the vendor had a new version, upgrades were that much harder because they could break the customizations. Many would suffer nervous breakdowns during this time, but consulting companies would happily offer help in exchange for a chunk of the company's life savings.

Holy Crap I did it Too!

That brings me back to the NAS unit. I got it initially so I could store my photos, documents and music in one location instead of having them dispersed on four different PCs and dozens of other flash cards and portable drives and memory sticks. I could have rolled my own by putting together a simple Linux server in a low profile case, but I wanted the off-the-shelf solution that would let me plug-and-play. I wanted the benefits of commodity.

But like the corporate idiots before me, I got greedy and wanted this wonderful Linux server to do more. So I bootstrapped. I customized. And I ran into an incompatibility: the latest version of the NAS software doesn't work properly when you also have the NAS change the default shell from ash to bash. The change caused other commands in the startup scripts to fail, so several services, such as the sleep function, the audio server and the photo server, ceased to work. The unit would not even heed the manual shutdown or restart commands. Ugh.

I suppose I'm not quite as stupid as the so called "leadership" which commits to decisions that really create hassles for thousands of people and ultimately cost billions and billions of dollars. My suffering is confined to just me; honest men wouldn't have it any other way. But I do feel some embarrassment at having made a similar mistake.

Sometimes It Takes Two to Mess Things Up

I had help though. For the first year I owned the Synology unit I was pretty happy with it. I still am, when it comes to the basic functionality of the unit. However, when I spend money on something I expect it to work, not to be brittle like the rest of the software out there. The support forums for the Diskstations are filled with people that have problems similar to mine, some even from folks that did not bootstrap. It appears that the regular updates to the DSM software can be risky, which shouldn't surprise me, but the level of dramatic errors and functionality loss that can occur does. This is Synology's own hardware, so they ought to be able to release a beta that doesn't crush functionality. This excellent thread [serverfault.com] shows, though, that Synology's developers apparently took several shortcuts and made some sloppy moves in building their software and products.

In my case, I had to remove the lines I added to the profile config to launch the bash shell and let the unit stay in its default ash shell. Now the Diskstation again responds to the manual shutdown and restart commands. After reindexing, Audiostation is back to working status. However, the sleep behavior is still broken, and the next thing for me to try is downgrading the DSM software (currently in version 4.3 beta) back to perhaps version 4.2. The DSM front-end warns before doing updates that the software cannot be rolled back, but thanks to the enterprising user community, there are ways to do it.

I'll be working on downgrading the software. I will probably lose the GUI-based task scheduler since that was part of the 4.3 beta, but I may still be able to access crontab in ash, and create the scheduled job that way. And having the Diskstation off for a bit isn't so bad; it'll be nice not to worry about those Internet pinheads that keep probing the machine. In the meantime I'll continue to use my laptop for some of my Linux needs.
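If the GUI scheduler doesn't survive the downgrade, a plain crontab entry should cover the nightly run. As a sketch, it would be something along these lines in /etc/crontab (the script path is hypothetical, and the exact crontab format can vary by DSM version):
# run the database backup every night at 2:30 AM as root
30 2 * * * root /opt/bin/bash /volume1/scripts/dbbackup.sh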

Monday, September 09, 2013

The Linux Adventure Part 4: Putting the machine to work

All right. Moving the vpnc command to the main script has indeed worked, though I do need to quickly get onto obfuscating the password so it doesn't sit visible in the config file.

Once I did that, I confirmed it ran ok, connecting to the VPN, running the database copy, and then disconnecting from the VPN. All ran well.
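For the curious, the overall shape of the script is nothing fancy. Here's a stripped-down sketch; the host, user, password, and database names are placeholders rather than my client's real ones (and yes, that password sitting in plain sight is exactly the obfuscation problem I mentioned above):
#!/opt/bin/bash
# connect to the client VPN using the saved vpnc configuration
/opt/sbin/vpnc
# pull the needed tables from the client database and load them into the local copy
/usr/syno/mysql/bin/mysqldump -h clientdbhost -u backupuser -pSECRET clientdb > /tmp/clientdb.sql
/usr/syno/mysql/bin/mysql -u root dbbackup < /tmp/clientdb.sql
# drop the VPN connection now that the copy is done
/opt/sbin/vpnc-disconnect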

To automate, I utilized a new feature of the Synology Diskstation's DSM software, version 4.2. There's now an integrated Task Scheduler. Interestingly, it doesn't appear to be a simple GUI interface to crontab. Instead it has some proprietary commands and probably data structures. It's also poorly documented. The help file explains what the various parts of the Task Scheduler screen are but doesn't have Synology's usually good tutorials on operating the feature. DSM 4.2 is in beta I believe, so perhaps the documentation will improve when the final product is out.

In any event, it's fairly self-explanatory to set the custom user job up. You give it a name and enter the command exactly as you would enter it at the command line interface, and then set a frequency. You can also run on demand at any time from the GUI.
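In my case the command field ends up being exactly what I'd type in a terminal session, something like this (the script path here is hypothetical):
/opt/bin/bash /volume1/scripts/dbbackup.sh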

I don't like that there's no record of tasks in the logs when you run them. Perhaps that will also be improved in the final release. But as it stands, when you run a task it doesn't look like anything is happening and none of the task's output is displayed anywhere.

You have to run this line from the terminal session and it will give you information on the scheduled tasks and what their last run status was:

/tmp/synoschedtask --get

Aside from the /tmp directory being a weird place to put a main feature, this line will return a list of the scheduled tasks and their configurations, along with a last run time and status.

At this point I have a boat load of error handling and feedback features to add but can now start taking advantage of automation to have this thing run automatically every day.

Saturday, September 07, 2013

Synology Diskstation Security Tips

If you have a Synology Diskstation and you're only using it for local hosting of files on your LAN or WiFi network, these tips may not be as critical.

IP Auto Block

But if you've opened up the unit to accept connectivity from the outside world via the Internet, you would be well advised to regularly log in as the admin and review the system logs. I didn't notice much activity at first when I allowed remote access to the Diskstation, but recently I started seeing regular attempts to log in from unknown sources. Several each day, like these:



It's really unnerving to see that. I'm not naïve but my server is literally a tiny speck of nothing in the universe and has nothing of value to anyone but me. Yet the universe must be filled with pinheads who have nothing better to do than try and hack random IP addresses. Actually, it's highly likely the login attempts aren't being done by a human but by a bot that's already found its way onto other servers and is just probing.

The Diskstations have a feature called IP Auto Block. You can find it in the Control Panel of the Synology server admin tool. Turning it on will make the Diskstation automatically block an IP address once it has witnessed five failed login attempts from it. I highly recommend you turn this on. Since turning it on, my Diskstation has regularly been blocking several addresses each week. Ye gods, is there no honor left in this world?

Antivirus Essential

Also, Synology offers a utility called Antivirus Essential that you'll find in the Package Center under Security. It's a free anti-virus tool. So far I haven't had issues with viruses getting on the Diskstation but I'd also recommend installing this package to help ward off some potential problems.

The Linux Adventure Part 3: A Path to Success

Installing MySQL on the Diskstation

Continuing from the last installment: there are several items I listed on the to-do list that actually were already done. I didn't need to install MySQL on the Diskstation as it is already on there and enabled either by default or when I first configured it at setup.

Installing Bash on Ash

From the last installment, I had already installed the ipkg package management software. The command ipkg install bash will pull down the bash shell and install it on the Diskstation.

Test Manual Run of Existing Script

In preparation for the script test I first ran /opt/sbin/vpnc. This opened a connection to my client's VPN.

I then copied my existing bash script over to the Diskstation, but ran into several issues when trying to execute it. I had to relearn simple things, like making sure the path was part of calling the script unless its directory was already on the PATH (pwd shows the current working directory).
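The short version of what I relearned, with a hypothetical script path:
dbbackup.sh                      # only works if the script's directory is on the PATH
./dbbackup.sh                    # works when you're sitting in the script's directory
/volume1/scripts/dbbackup.sh     # works from anywhere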

I also had to learn new confusing things about Linux since a few confusing things weren't enough.

I made bash the initial shell by adding a few lines to the root profile to launch the bash shell. You can find these in the Google hits for "Synology bash", but even after getting the bash prompt when signing in (bash-3.2#), I still had to use special syntax to run the bash scripts through the bash shell.
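For reference, the profile additions in question are usually a variation on this commonly posted snippet (not necessarily character-for-character what I used):
# in /root/.profile: switch interactive logins from the default ash to the ipkg-installed bash
if [ -x /opt/bin/bash ]; then
    exec /opt/bin/bash
fi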

In other words, I installed the bash shell and got it to be the default when logging in, but the scripts still try to run under the Diskstation's native ash shell [Wikipedia.org]. Got that? Thank you so much, Linux. So upon initial attempts to run dbbackup.sh I would get beautifully intuitive syntax error messages like this:
Line 15: syntax error: unexpected "("

Using parentheses when defining variables indicates you are using an array to store the value. I don't know many languages that don't support parentheses or arrays, but because ash is designed to be super lightweight it doesn't quite have all the same functionality as bash. Here is a thread [busybox.net] explaining that ash doesn't support arrays. You can take a look at the documentation for ash [in-ulm.de]. It's pretty amazing considering its tiny footprint.
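In my script the culprit was an array assignment. Something like this (the variable and table names are made up) is perfectly legal in bash, but ash stops cold at the opening parenthesis:
# a bash array holding the tables to copy; ash has no array support
TABLES=("customers" "orders" "invoices")
for T in "${TABLES[@]}"; do
    echo "backing up table $T"
done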

But rather than look for ways to rewrite the script to be compatible with ash, I found that other people had run into similar problems. ipkg installs bash under /opt/bin, and you have to execute the bash script almost like a parameter to the bash command, like so (where the working directory is the location of the script):
/opt/bin/bash ./dbbackup.sh

This now made the Diskstation try to execute the script, but I was not home free yet. There were some calls in the script to commands such as mysqldump and mysql, and both of these also needed to be prefixed with their directory locations. So I updated the script by adding /usr/syno/mysql/bin to the front of those command calls.
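Concretely, the calls in the script went from bare command names to full paths, roughly like this (the connection details are made-up placeholders):
# before (fails: mysqldump isn't on the Diskstation's PATH):
# mysqldump -h clientdbhost -u backupuser -pSECRET clientdb > /tmp/clientdb.sql
# after (works: full path to the Diskstation's MySQL binaries):
/usr/syno/mysql/bin/mysqldump -h clientdbhost -u backupuser -pSECRET clientdb > /tmp/clientdb.sql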

Now the script started to run, but I got one last error before it would complete the backup. The Diskstation told me it couldn't find the destination database on the local MySQL server that I would be copying the source database into. So I launched phpMyAdmin (a natively supported Diskstation package), connected to the localhost server, and simply created an empty database for the backup.
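phpMyAdmin makes that a couple of clicks; the equivalent from the command line would be a one-liner like this (the database name is just an example):
/usr/syno/mysql/bin/mysql -u root -e "CREATE DATABASE dbbackup"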

It Works! I've Finally done Something that Works!  

    - Doc Brown (Back to the Future)

Now I ran the script and it worked just as it had on my laptop, reaching out via the VPN to the client database, grabbing the tables it needed, and copying them to the Diskstation. Awesome. Additional tips:
  • The Diskstation's MySQL installation is found at /usr/syno/mysql/bin
  • The Diskstation's MySQL localhost server stores databases at /volume1/@database/mysql 
Now I have opened up some options and benefits for this task. I have:
  • Multiple ways to back up my client's database
    • mobile notebook
    • desktop at home
    • remotely connecting to the Diskstation
  • Increased my knowledge of the Diskstation and my love for Linux's idiosyncrasies
  • Gained a backup of the backup (the Diskstation utilizes mirroring)
There are a few things left to do before the process is fully automated. I need to add the vpnc commands to the script if possible so I don't have to run those manually. Also, add error handling so that if something goes wrong the script will correctly tidy up after itself, closing the vpnc connection and exiting the script cleanly.
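For that cleanup, bash's trap builtin is probably the tool for the job. A rough sketch of the idea (paths are placeholders, and this isn't my finished script):
#!/opt/bin/bash
# make sure the VPN gets torn down no matter how the script exits, success or failure
cleanup() {
    /opt/sbin/vpnc-disconnect
}
trap cleanup EXIT
/opt/sbin/vpnc || exit 1
# ...the existing mysqldump/mysql copy steps go here; if any of them dies, the trap still fires...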