Recent Posts

New Year 2015!

After a year of disappearing from the Web, I’ve re-enabled the paixao.ca website. I’ll try to post a little more frequently – but don’t hold me to that.

In the last year I started a new job. I’m now a Technical Specialist with Cirrus9 – a managed cloud hosting solutions provider in Saint John, NB. And with this new position come lots of new technologies – so lots to learn and hopefully share with you.


Come back and visit regularly and see what’s new.


New Year’s Resolution – Backups

My recurring New Year’s Resolution is to do a better job on my backups. And it doesn’t matter how well (or badly) I did last year – I can do better!

I have my home environment well protected, thanks in part to a cloud-based service called CrashPlan. Read about that here.

But my corporate notebook always provides a challenge. Copying private corporate data to a public cloud such as CrashPlan is probably not the brightest idea. And the corporate-provided solution is meant for “regular users” with a few GBs – not “power users” with hundreds of GBs of data.

My goal is simply to back up my data. I assume that, if required, the installation of the OS and any applications will be done by a corporate imaging process, or even manually with a DVD or an ISO.

Here is my solution – a bit old school – but it gets the job done. 

The target for my backups is an external USB disk. Any disk will do – as long as it is big enough. Doesn’t have to be particularly fast or even portable.

The first step is encryption. Yes – ENCRYPTION!! Stuff happens. I might lose the disk in transit. It might be stolen. I can’t leave myself vulnerable to having sensitive data end up in the wrong hands. My tool of choice for that is TrueCrypt. I created an encrypted volume on the USB disk and I mount it as required during the backups. There is a Beginner’s Tutorial on TrueCrypt’s website that has all the details needed to get started.

The next step is seeding that first backup. Windows File Copy is fine for a few files and directories, but when I am copying hundreds of GBs, I need something better. The tool I use for this copy is TeraCopy. TeraCopy claims to be faster than Windows Copy (I never tested that), but more importantly, it provides the ability to start/stop/pause and recover from errors during the copy process. I actually use TeraCopy ALL the time – not just for backups.

What data am I going to copy? For me, it’s the Documents Library. (I don’t bother with the Pictures, Music and Video Libraries since I don’t have any such content on the work computer.) I copy my Desktop, Downloads and Favorites folders as well. And I also copy my VMware Workstation “Virtual Machines”.

TeraCopy is great for getting that initial copy. But you don’t want to manually do each of those steps every time. The last utility in my toolbox is Microsoft SyncToy. It allows me to build “folder pairs” – a source folder and a target folder. I set up a folder pair for each of the directories above. Advanced options also let me determine what to do when I delete a file in the source folder. Do I want to keep its copy in the target folder? Or do I want to delete it there as well?

Once it’s all set up – I can run a backup with a couple of clicks!!

But can’t I automate it further? Well – maybe… If the target location were mounted all the time, a simple “scheduled task” could take care of the backups. But because of the encrypted volume – which requires a password when it’s being mounted – this is not going to work unattended. For now, I have a recurring Outlook reminder to start the backup a couple of times a week.
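For what it’s worth, those couple of clicks can be wrapped up in a small batch file – then a backup run is one double-click plus the TrueCrypt password prompt. This is just a sketch: the install paths, volume file and drive letters below are assumptions, and it leans on TrueCrypt’s documented command-line switches and SyncToy’s SyncToyCmd.exe companion.

@echo off
rem Sketch only - adjust paths, volume file and drive letters for your setup.

rem Mount the TrueCrypt volume from the USB disk as X: (prompts for the password)
"C:\Program Files\TrueCrypt\TrueCrypt.exe" /v E:\backup.tc /l X /q

rem Run all of the SyncToy folder pairs
"C:\Program Files\SyncToy 2.1\SyncToyCmd.exe" -R

rem Dismount the encrypted volume when the sync is done
"C:\Program Files\TrueCrypt\TrueCrypt.exe" /d X /q
exit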

And what about offsite? I have not gone that far – but there is no reason I couldn’t rotate through a number of USB disks – and keep a copy on my desk at work, and one on my desk at home.

Your turn – tell me how you’ve solved your backup dilemmas.

Zabbix – Open Source Monitoring

Last spring, at a job interview, the interviewer told me that they were considering using Zabbix to monitor their environment. I was asked what I knew about Zabbix. And unfortunately, the answer was nothing. They told me to go learn it and come back in a week – and so I did!

The job never materialized, but nonetheless, I am grateful that I was introduced to this product, and want to share with you a little bit about how I am using it… At home!!

For those not familiar with the product, the headline on their web page describes it as “The Enterprise-class Monitoring Solution for Everyone”. Well – I don’t know about “everyone”, but for those of us used to Linux and other open source projects, I think it’s definitely worth considering.

To get it up and going, I fired up a CentOS 6.4 64-bit Linux virtual machine, installed the LAMP stack on top, and off to the races. The getting-started guide provided by the product is pretty good, so I won’t bother repeating it here. I will just give you a few examples of how I am using it in my home environment.
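For reference, the server install itself boils down to a handful of commands. This is only a rough sketch of what “off to the races” means – it assumes the Zabbix yum repository has already been added as per their guide, and that MySQL is the database backend:

# Install the server, web front-end and agent from the Zabbix repository
yum install zabbix-server-mysql zabbix-web-mysql zabbix-agent

# Create the database, then load the schema shipped with the package
# (the getting-started guide covers the exact path to the schema files)
mysql -u root -p -e "create database zabbix character set utf8;"

# Start everything up and have it survive a reboot
service zabbix-server start
service zabbix-agent start
chkconfig zabbix-server on
chkconfig zabbix-agent on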

ALARMS

The first thing I would like a monitoring system to tell me is: what are the current issues with my servers? Without doing any customization to the agents, I am able to see the current troubles…

[Screenshot: current issues]
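(As an aside – when an agent isn’t reporting, the zabbix_get utility that ships with Zabbix is handy for poking it directly from the server. A couple of example queries using standard agent item keys, with “alpha” standing in for whichever host you’re checking:)

# Is the agent even answering?
zabbix_get -s alpha -k agent.ping

# One-minute load average, and free space on /
zabbix_get -s alpha -k system.cpu.load[all,avg1]
zabbix_get -s alpha -k vfs.fs.size[/,pfree]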

GRAPHS

That is a real-time view of what is going on. But maybe this is normal? Wouldn’t it be nice if you could see how your box has been performing over time?

Here is a graph showing CPU load on the “alpha” virtual machine over the last hour while I was generating load.

[Graph: CPU load on alpha]

And a graph showing memory usage on alpha over the same period.

[Graph: memory usage on alpha]

Or maybe you want to predict when your file system is going to fill up?

[Graph: disk space usage on saturn]

Or how much of the bandwidth you are buying you are actually using?

[Graph: network usage on saturn]

NO AGENT?

And what if you can’t install an agent on the device – maybe it’s an appliance, or locked down by the vendor? A nice, easy “ping” test will at least tell you it’s on the network – or more importantly – when it’s not!

[Screenshot: ping check alerts]

WEB MONITORING

Another test that I am doing – is my www.paixao.ca available? I created a “scenario” to retrieve my home page to see if it is on-line – and even retrieve a 5 MB file so that I can judge how fast (or slow) my hosting provider is.

[Graph: www.paixao.ca download speed]

[Graph: www.paixao.ca response time]

NOTIFICATIONS

When a “problem” does show up – what can it do?  It can send you an email. Or if you have a modem attached, and are willing to pay, there is an SMS interface.

[Screenshot: notifications]

MAPS

My favourite feature, though, is the maps. Although it takes a little bit of effort to set up, you can create a map of your environment that reacts to alarms in real time – giving you a graphical view of what is going on.

[Screenshot: network map]

You can quickly see what elements are in trouble, and even the relationship between elements. 

OTHER FEATURES

There is only so much I can do in my home lab. There are other features that, in time, I would like the opportunity to play with:

  • SNMP Traps coming from devices into Zabbix.

  • IPMI Interface into the hardware. (The OS doesn’t necessarily know there is a hardware fault – IPMI gets you a view into the hardware.)

YOUR TURN

Are you using Zabbix? Doing anything cool with it? 

ZFS – It’s that Easy

I did another article for Train Signal…

In the Solaris world, we have had access to the ZFS file system for quite a few years. It’s incredibly simple to use and incredibly powerful and flexible. It replaced the need for Solaris DiskSuite and Veritas Volume Manager, and even the UFS and VxFS file systems. Let’s get started with ZFS!
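The article walks through the details, but as a quick taste of “that easy” – a mirrored pool and a couple of file systems is only a handful of commands (the disk names below are just placeholders):

# Create a mirrored pool out of two disks
zpool create tank mirror c0t2d0 c0t3d0

# Carve out file systems - no newfs, no /etc/vfstab editing, mounted right away
zfs create tank/home
zfs create tank/home/nuno

# Per-file-system properties and snapshots
zfs set quota=20g tank/home/nuno
zfs set compression=on tank/home
zfs snapshot tank/home/nuno@today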

Continue reading on the Train Signal site…

Update
Train Signal was bought out by Pluralsight.
The article is now on the Pluralsight website.

http://blog.pluralsight.com/zfs-it%E2%80%99s-that-simple

SSH Jumpbox

I wrote an article for Train Signal showing how to build an SSH jumpbox to facilitate your job as a sysadmin.

If you are a UNIX sysadmin for any number of servers, you need to build yourself a Linux secure shell (SSH) jumpbox. Do it now! Having a centralized location that you can use to quickly “jump” to any box saves a whole bunch of time. Not only that, it opens opportunities for speeding up repetitive chores, and even automating tasks.
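The article has the full recipe; just to give a rough flavour of the idea (the hostnames and usernames here are made up, not taken from the article):

# Generate a key on the jumpbox and push it out to a server you manage
ssh-keygen -t rsa
ssh-copy-id admin@web01.example.com

# Now that box is one hop away from the jumpbox...
ssh admin@web01.example.com

# ...or reachable from your workstation, using the jumpbox as a relay
ssh -o ProxyCommand="ssh -W %h:%p jumpbox.example.com" admin@web01.example.com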

Continue reading at Train Signal Website…

Update
Train Signal was bought out by Pluralsight.
The article is now on the Pluralsight website.

http://blog.pluralsight.com/linux-ssh-jumpbox

Backups with CrashPlan

Twenty years ago, a power outage brought down a Novell server I was the sys-admin for… When the power came back, the data volume was corrupt… And the backup was three weeks old… And 50 or so folks lost three weeks of email… That’s when I learned the lesson that backups are important!!

In the last twenty years, I have successfully recovered from a number of hard drive failures, scripts gone wrong, and even the occasional stupid user error…  All because backups are pretty much the first thing I do once I build a new machine – whether for work or for home…

My current home solution for backups includes CrashPlan… Specifically, their “CrashPlan+ Family Unlimited” offering.

My main box is backed up in full to the CrashPlan cloud – and also, through their software, to a local hard disk. The local copy is not because I don’t trust the cloud (I do), but simply to speed up a restore in the event of a major failure. My less critical (and smaller) boxes are sent just to the cloud. There are even options to back up to your friends’ computers, although I’ve not ventured down that path yet…

The initial seed is a bit of a nuisance… My experience shows that I can do it at about 100 GB/day… So, it would take a while to do an initial backup… But the good news, of course, is that you only need to do that full backup once… After that, only changed data gets sent to the cloud.

The backup is only as good as the restore… So, you need to test that out. In my case, doing a full restore is impractical. But I will routinely bring back a few GBs just to prove that it works, and to get a feel for the rate to expect. (I brought back 5 GB in 3 minutes while writing this paragraph.)

And as an added bonus – CrashPlan gets me remote access to my data… If I’m looking for a file, all I need is their App installed on my iPad, or any old web browser… And I can magically bring up that file… Even if I am not at home…

I would recommend CrashPlan to any home user looking for a backup solution…

(Before CrashPlan, I was a user of EMC’s Mozy backup solution. And that too was great! But their pricing model made it prohibitive with the amount of data that I have…)

OpenDNS and OpenDNS Umbrella

I have been using OpenDNS at home to keep the kids from getting onto questionable sites for over a year…

The process is quite simple… Whatever device is doing DHCP for your home network (your wireless router, or maybe a residential gateway), just tell it to give out OpenDNS’s name servers instead of the name servers provided by your ISP…

208.67.222.222
208.67.220.220

That alone is sufficient to keep all your internal devices from known malware and phishing sites…

But you can take it a step further… If you set up a free account with OpenDNS, and download and run a client somewhere in your home network (so they know your IP)… Then you can set up filters to block out DNS lookups to specific web site categories, or even individual websites on your home network…
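(A quick, hedged aside – it’s easy to confirm the change took effect from any machine on the network; dig and nslookup come with the usual bind-utils/dnsutils packages:)

# The answering server should be one of the OpenDNS addresses
nslookup www.paixao.ca

# Or query OpenDNS directly to confirm it's reachable from your network
dig @208.67.222.222 www.paixao.ca +short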

It’s not foolproof of course – but it was sufficient to keep my kids out of trouble…


That was… Until one of them got a cell phone, and she figured out that all she needed to do was shut off her Wi-Fi, and access the sites over 3G..

Well… OpenDNS has a solution for that too… OpenDNS Umbrella

The “mobile” service offering – probably meant for the corporate world – sets up a VPN tunnel between the phone and OpenDNS proxy servers… And then it leverages that proxy to filter out web sites as above. The cost is $20/year for an individual plan…

Again, not fail safe… And it does slow down the older phones (Apple 3GS) a bit… But it’s been effective at keeping her out of some sites.

For now, anyways…

Netbackup Scripts – clean-jobs

Far too many jobs end up in the Java GUI or in the bpdbjobs output… If a job is successful, why does it stick around for three days? This hides the real problems that may have occurred.

My solution is a cron job that deletes successful jobs (status codes 0 and 1).


#!/usr/bin/perl
#
# clean-jobs : delete successful jobs (status codes 0 and 1) from bpdbjobs

use strict;
use warnings;

my ($jobid, $type, $state, $status, $policy,
    $sched, $client, $media_server, $pid);
my @jobs;
my $delete_string;

my $cmd = '/usr/openv/netbackup/bin/admincmd/bpdbjobs';
chomp(my @output = `$cmd`);
shift @output;    # Skip the header record..

foreach (@output) {
    s/Image Delete/Image_Delete/;    # Keep the job type as a single field
    ($jobid, $type, $state, $status, $policy, $sched,
     $client, $media_server, $pid) = split(" ", $_);
    if (($state eq "Done") && (SafeCode($status) == 0)) {
        push @jobs, $jobid;
    }
}

# Build a comma-separated list of the job IDs to delete
$delete_string = shift @jobs;
foreach (@jobs) {
    $delete_string .= ",$_";
}

if (defined($delete_string)) {
    `$cmd -delete $delete_string`;
}

exit;

# Returns 0 if the status code is one we consider safe to delete
sub SafeCode {
    my ($query) = @_;
    my @safecodes = qw/ 0 1 /;
    foreach (@safecodes) {
        if ($query == $_) {
            return 0;
        }
    }
    return -1;
}
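For what it’s worth, a crontab entry along these lines runs it hourly – the script path is just an assumption, adjust it to wherever you keep the script:

# Clean up successful NetBackup jobs at the top of every hour
0 * * * * /usr/local/scripts/clean-jobs.pl > /dev/null 2>&1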

Netbackup jobs that won’t go away..

Occasionally, you get a job in the queue that you can’t cancel or delete through the GUI or the command line… Here is the solution to that…

• Note the JobID from bpdbjobs
• Stop Netbackup with bp.kill_all
• /usr/openv/netbackup/bin/bpjobd -r {JobID}
• Restart Netbackup with /etc/init.d/netbackup start

The following thread discusses different attempts at solving this issue, with this process being the final solution.

http://www.symantec.com/connect/forums/how-clear-and-delete-waiting-retry-status-50-job

Netbackup Scripts – client-last-backup

Quick report on the last backup for a client of your choice..

(Your mileage may vary with this one… I made some assumptions along the way that reflect my environment… But you can always use it as a starting point for your own version…)

#!/bin/ksh

[ $# != 1 ] &&
    echo "Usage: $0 {CLIENT_NAME}" &&
    exit 99

CLIENT=$1
/usr/openv/netbackup/bin/admincmd/bpplclients \
    -allunique -noheader | awk '{print $NF}' |
    grep ${CLIENT} > /dev/null
[ $? -ne 0 ] &&
    echo "Error: ${CLIENT} is not a valid client." &&
    exit 99

TMP_FILE=/tmp/`basename $0`.${PPID}
bperror -client ${CLIENT} > ${TMP_FILE}

JOBID=`grep "nbjm started backup job" ${TMP_FILE} |
    egrep -v "oracle|catalog|DSSU" |tail -1 |awk '{print $7}'`

TYPE=`grep ${JOBID} ${TMP_FILE} |
    grep "nbjm started backup job" |
    tail -1 | awk '{print $20}'`

START=`grep ${JOBID} ${TMP_FILE} |head -1 |awk '{print $1}'`
START_PRETTY=`bpdbm -ctime ${START} |cut -c18-`

FINISH=`grep ${JOBID} ${TMP_FILE} |tail -1 |awk '{print $1}'`
FINISH_PRETTY=`bpdbm -ctime ${FINISH} |cut -c18-`

JOB_COUNT=`grep ${JOBID} ${TMP_FILE} |
    grep "successfully wrote backup" |
    wc -l | sed 's/ //g'`

TOTAL_SIZE=0
for SIZE in `grep ${JOBID} ${TMP_FILE} |
    grep "bptm successfully wrote backup" |
    awk '{print $20}' `
do
    TOTAL_SIZE=$((${SIZE}+${TOTAL_SIZE}))
done

SIZE_UNIT="KB"
if [ ${TOTAL_SIZE} -ge 10000 ];
then
    TOTAL_SIZE=$((${TOTAL_SIZE}/1024))
    SIZE_UNIT="MB"
fi

if [ ${TOTAL_SIZE} -ge 10000 ];
then
    TOTAL_SIZE=$((${TOTAL_SIZE}/1024))
    SIZE_UNIT="GB"
fi

echo " Client: ${CLIENT} "
echo " Start: ${START_PRETTY}"
echo " Finish: ${FINISH_PRETTY}"
echo " Backup Type: ${TYPE}"
echo " Backup Size: ${TOTAL_SIZE} ${SIZE_UNIT}"
echo " Job Count: ${JOB_COUNT}"
echo " File Systems/Volumes:"
for FILE_SYSTEM in `grep ${JOBID} ${TMP_FILE} |
    grep "handling path" |
    sed 's/Shadow Copy Components/Shadow_Copy_Components/' |
    grep -v "SNAP_ID" | awk '{print $NF}' | sort `
do
    echo ${FILE_SYSTEM} | grep MNTPOINT > /dev/null
    if [ $? -eq 0 ]; then
        FILE_SYSTEM=`echo ${FILE_SYSTEM} |
            awk -F"=" '{print $3}' |
            awk -F"," '{print $1}'`
    fi
    echo "    ${FILE_SYSTEM}"
done

rm -f ${TMP_FILE}

exit 0

And the output…

Client: xxxxx
 Start: Aug 30 18:01:21 2012
 Finish: Aug 30 18:07:41 2012
 Backup Type: Incremental_Backup
 Backup Size: 1380 MB
 Job Count: 5
 File Systems/Volumes:
    /
    /boot
    /home
    /usr
    /var

Netbackup via Command Line…

I live in a Unix shell… Nothing bothers me more than having to launch a big heavy GUI to find out something simple…

Here are a couple of Netbackup scripts that I make use of on a daily basis to avoid the Java GUI..

client-list:  Lists clients currently backed up.


#!/bin/bash
/usr/openv/netbackup/bin/admincmd/bpplclients \
    -allunique -noheader  | awk '{print $NF}'
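Saved somewhere in your PATH (the name and location are up to you), it chains nicely with everything else on the command line – for example:

# How many clients are we backing up?
client-list | wc -l

# Is a particular box in there? (the hostname is just an example)
client-list | grep -i webserver01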

SunFreeWare and SunSolve

SunFreeWare

For the longest time, Sunfreeware (http://sunfreeware.mirrors.tds.net/) was my main source for pre-compiled Solaris packages. But they stopped updating it at the end of 2011. And the site that’s left doesn’t always work for me.

The official replacement site is http://unixpackages.com/, but as you’ll quickly discover, it is not free…

They have left the original FTP servers up and running. And they’ve saved the day a couple of times when I was looking for older versions of OpenSSL, for example.

Sparc: ftp://sunfreeware.mirrors.tds.net/pub/sunfreeware/sparc/

Intel: ftp://sunfreeware.mirrors.tds.net/pub/sunfreeware/intel/

Here is hoping they don’t shut those down.

WeSunSolve

And who remembers looking for patches in SunSolve… Since its replacement by “My Oracle Support” (aka MOS), it’s never been quite the same.

I’ve recently discovered WeSunSolve. It has tonnes of info on Solaris packages, patches and BugIDs. You’ll still need your MOS credentials to download the patches, but I find this site a lot easier to use than MOS.

http://wesunsolve.net/

New WordPress…

Here is a new WordPress site to try to consolidate a couple of other blogs into this one location…

Step 1 – Pick a theme. After wasting an hour browsing through a dozen themes, WordPress’ own “Twenty Ten” seems adequate.

Step 2 – Test “Source Code”… It’s always been frustrating to find a way to share a chunk of source code or a script… Google says it’s this easy with “SyntaxHighlighter Evolved”.

Let’s try it…

#!/bin/ksh
for NAME in Nuno Tracie Julianna Ryan
do
  echo "Hello ${NAME}."
done
exit 0

Hey it worked… 🙂

Here’s the link. http://en.support.wordpress.com/code/posting-source-code/

Step 3 – Let’s import my Tumblr… There’s a plugin for that too.

Stand by……..
That sort of worked… I had to update the “source code” bits…
Bye Bye Tumblr.

Step 4 – Let’s see if I can bring in my old WordPress blog.

Stand by again……….
And success…
Have to clean up my categories though…

Dropbox Cache Growing Out of Control…

[Originally Posted to Tumblr. ]

I love DropBox… But it’s not perfect…

Over the last couple of days, one of my machines has been having trouble keeping in sync… The trouble is limited disk space… So, I went looking for where this disk space was being taken up, and DropBox was the culprit…

Whenever a file changes on one of your other linked machines, DropBox grabs the latest copy (as it should), but it moves and renames the current copy to a .dropbox.cache folder. It stays there for up to three days.

To solve the problem, a little batch file…

@echo off
rem Stop the Dropbox client so the cache files aren't locked
taskkill /f /im Dropbox.exe /t
rem Purge the local cache folder
del /s /f /q D:\Dropbox\.dropbox.cache\*
rem Restart the Dropbox client
start "" "%APPDATA%\Dropbox\bin\Dropbox.exe"
exit

You’ll have to adjust the D:\Dropbox path to match the location of your Dropbox.

For now, I’m just running it as needed. But I am considering putting it into a scheduled task.
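If I do go the scheduled task route, something like the following should do it – a sketch only, with the script path and the schedule as placeholders:

rem Run the cache cleanup every Sunday at 3 AM
schtasks /create /tn "DropboxCacheCleanup" /tr "C:\scripts\dropbox-cache.cmd" /sc weekly /d SUN /st 03:00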

You don’t have a Dropbox yet? Get one!! Follow this referral link for an extra 500 MB. http://db.tt/A8rBVfu

Netbackup Client Uninstall in Solaris

As far as I can tell, Veritas does not provide an uninstall script to remove the Netbackup 6.0 Client from Solaris 10 Servers…

I went looking to see what files the client installed, and what services were started up, and based on that, here is my best guess at how to cleanly remove the client.


svcadm disable network/bpcd/tcp
svccfg delete network/bpcd/tcp

svcadm disable network/vnetd/tcp
svccfg delete network/vnetd/tcp

svcadm disable network/vopied/tcp
svccfg delete network/vopied/tcp

svcadm disable network/bpjava-msvc/tcp
svccfg delete network/bpjava-msvc/tcp

cd /var/svc/manifest/network
rm bpcd-tcp.xml vnetd-tcp.xml vopied-tcp.xml bpjava-msvc-tcp.xml

vi /etc/inetd.conf
(Remove bpcd vnetd vopied & bpjava-msvc)

vi /etc/services
(Remove bprd bpcd vnetd vopied & bpjava-msvc)

pkill -HUP inetd

rm -rf /usr/openv
(Make sure /usr/openv isn't a separate file system first..)


New Home PC

After months of deliberation, I finally settled on the components
that will make up my new Home PC…

It’s fast… It’s quiet… And I’m quite pleased with it so far…
(This is not a gaming system – that was never the intent – but
it should run most currently shipping games at modest settings.)

Motherboard: ASUS P8P67 Deluxe

CPU: Intel Core i7 2600K Quad Core Processor – 3.4 GHz

Case: Fractal Design Define R3 Black Silent Computer Case

Power Supply: Corsair Professional Series Gold AX850

Cooling: Coolit Systems Eco CPU Water Cooling

RAM: G.SKILL Ripjaws – DDR3-1600/CL7 – 2 x 4 GB

OS Disk: OCZ Vertex 3 SandForce Solid State Disk – 120 GB

Data Disks: 2 x Western Digital Caviar Green – 2 TB each

Optical Disk: LG Super Multi 22X DVD Writer

Graphics: ASUS Radeon HD 6870 DirectCU

Operating System: Windows 7 Professional