
Thursday, December 29, 2011

Update Windows Path Environment Variable

Today I had to make similar changes to 10 Windows servers.  The changes consisted of copying a couple of specialized programs from a source location and updating the Windows path.  I couldn't remember how to update the path variable from the command prompt (other than for the current command prompt, of course), so I tooled around the Internet a bit until I found it.  Along the way I found a lot of confusing and incomplete information about the Windows path variable, mostly brief posts followed by long discussions asking for more detail.  Not knocking anyone here, but most of it appeared to be written by developers rather than system admin types.  So I wanted to provide a thorough discussion of the Windows path variable: what it is, how it works and how to use it.

The Windows path variable has its roots way back in MS-DOS, which means it's been around for over 30 years.  Basically, the path is a list of directories (or folders) to be searched when a command runs.  For example, say you're at the root of the C: drive in a Windows command prompt and you execute a command like robocopy.  Windows looks in the current directory first for the executable, robocopy.exe.  Since robocopy.exe isn't in the root of C: by default, if no path variable were defined you would get the message, "'robocopy' is not recognized as an internal or external command, operable program or batch file."  But since Windows by default has at least "C:\Windows\system32;C:\Windows" (usually more) in the path, the first directory, C:\Windows\system32, is searched for robocopy.exe, and since it's there the program runs.  If it weren't there, the next directory in the path, C:\Windows, would be searched, then the next, and so on until the program is either found or the path is exhausted.

The Windows path is searched whenever a command is executed, either from within a command prompt or from Windows itself, unless it's an internal command like dir or the executable is found in the current directory.  Remember, it's always possible to specify the full path to an executable, as in C:\Windows\system32\robocopy.  This launches the program directly without searching the path.
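The search order can be sketched in a few lines of Python (a simplified model; real cmd.exe also consults the PATHEXT variable for the full list of executable extensions):

```python
import os

def find_executable(name, path_dirs, exts=(".exe", ".bat", ".cmd")):
    # Mimic cmd.exe's lookup: check the current directory first,
    # then each path entry left to right.  The first hit wins.
    for directory in [os.getcwd()] + list(path_dirs):
        for ext in exts:
            candidate = os.path.join(directory, name + ext)
            if os.path.isfile(candidate):
                return candidate
    return None  # "'name' is not recognized as an internal or external command..."
```

For example, `find_executable("robocopy", os.environ.get("PATH", "").split(os.pathsep))` walks the same directories cmd.exe would.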

One of the quickest and easiest ways to view your current path is to open a command prompt and enter the command path.


Modifying the Windows Path Environment Variable
It's worth noting that when a program is installed, its directory is often added to your path.  Your path will often consist of a dozen or more entries (in fact, I stripped down my path for the previous screen shot to show only a minimal path).

Back in the days of DOS, and even Windows 95/98 and Windows NT, the path was set through files called config.sys and autoexec.bat (usually both).  To make a permanent change to your path it was common to edit either or both of these files.  But since these operating systems are so old and probably not used by many, we'll skip the details.

So starting with Windows 2000 and XP (right up to and including Windows 2003, Windows 2008, Windows Vista and Windows 7, and I'm sure Windows 8), Microsoft did away with config.sys and autoexec.bat and moved the path variable (and others) into the Windows registry.  Of course you could edit the path directly in the registry, but that's usually not a good idea, and since there are better ways, that's what we'll look at.

First, it's important to understand that there are different path variables; system, user, and what I like to call context (or session) variables.  For example when you open a command prompt the system and user path variables are read from the registry and assigned to that command prompt.  If you use the path command to modify the path it will only be modified within that command prompt session, or the "current context."

Let's say you are using a program while in a command prompt and want to add its directory to your path just for the current session.  You would simply enter the command set path=%path%;<path to new program>.  This takes your current path (as specified with %path%) and appends the new entry.  NOTE: use a semi-colon (;) between path entries.

For this example I added the path to a useful Amazon S3 utility called S3.exe.

Example: set path=%path%;C:\Admin\Utils\S3exe


Now I can run S3.exe without having to specify the full path to the command.  However, when I close the command prompt and open a new one this entry will not be present.  I'd have to add it again.  Or, add it to the Windows registry.

Setting Path Variables (semi-)Permanently
There are a few ways to make the path variables (both system and user) permanent.  At least they are permanent until you change them....

Settings for both user and system path variables (and a lot of others) are accessed through the System Properties applet.  Here are two ways to access System Properties.
  1. Right-click on Computer (formerly My Computer), click Properties.  Within the System window click Advanced System Settings.  This will open the System Properties window.
  2. OR, click Start, Settings, Control Panel, System, Advanced System Settings.   This will also open the System Properties window.

Click Environment Variables in the lower-right.  This will (finally) open the Environment Variables dialogue window.


The Environment Variables window allows you to add/edit entries to both the current user's path and the system path.  Entries in the system variables path will apply to all users on the system, whereas the user variables will apply only to the currently logged on user - i.e. you.

Just like with about everything else, there's another way to do this, and that's with the setx command from a command prompt (my preferred method).  Using setx you can make permanent changes to either the user path variable or the system path variable (or both).  NOTE: use quotes where shown in the commands.  The /M switch sets the system path variable.  Both commands write to the registry, just like the previous examples.  Also note that %path% expands to the combined system and user paths, so these commands append the new entry to that combined value.

Example (user path variable): setx path "%path%;C:\Admin\Utils\S3exe"
Example (system path variable): setx path "%path%;C:\Admin\Utils\S3exe" /M


NOTE: these commands do NOT update your current path (in the command prompt), so you'll have to open a new command prompt for the path variables to include new/updated entries.

In the beginning I mentioned I had to make similar changes to several servers.  I was able to use robocopy to copy the programs/files I needed and setx to update the system path variable.  My commands consisted of about 5 lines that I just copied and pasted into the command prompt on each server and my "deployment" was completed in minutes using these simple commands.

Wednesday, November 9, 2011

Optimizing Images to Save Bandwidth on AWS

Last month our finance guy came to me in a bit of a panic to point out that our Amazon Web Services (AWS) bill was way higher than expected - by several thousand dollars.  After the initial shock wore off I started digging to figure out just what was going on.

From Amazon's bill I could easily determine that our bandwidth costs were way up, but other expenses (like EC2) were in line with previous months.  Since I have several S3 buckets that house content served from S3 and CloudFront, and since Amazon doesn't break down the costs in enough detail, I had to figure this one out on my own.

We serve millions of page views per day, and each page view causes several calls to different elements within our infrastructure.  Each call gets logged, which makes for a lot of log files - hundreds of millions of log lines per day to be exact.  But, this is where the detail I needed would be found so I had to dig in to the log files.

Because I collect so much log information daily I haven't built processes (yet) to get detailed summary data from the logs.  I do, however, collect the logs and do some high-level analysis for some reports, then zip all the logs and stuff them in a location on S3.  I like to hang on to these because you never know when a) I might need them (like now), and b) I'll get around to doing a deeper analysis on them (which I could really use, especially in light of what I've uncovered tracking down this current issue).

I have a couple of under-utilized servers so I copied a number of the log files from S3 to my servers and went to work analyzing them.

I use a plethora of tools on these log files (generally on Windows) such as S3.exe, 7zip, grep (GNU Win32 grep) and logparser.  One day I'm going to write a post detailing my log collection and analysis processes....

I used logparser to calculate the bandwidth served by each content type (css, html, swf, jpg, etc.) from each bucket on a daily basis.  My main suspect was image files (mostly jpg) because a) we serve a lot every day (100 million plus), and b) they are generally the biggest of the content we serve from S3/CloudFront.
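That per-extension rollup can be sketched in Python as well; the field positions below are purely illustrative, since S3 and CloudFront log formats carry many more columns (in my real queries logparser handles the actual log schema):

```python
from collections import defaultdict

def bandwidth_by_extension(lines, key_field=0, bytes_field=1):
    # Sum bytes served per file extension.  Field positions are
    # assumptions for this sketch, not the real S3/CloudFront layout.
    totals = defaultdict(int)
    for line in lines:
        fields = line.split()
        if len(fields) <= max(key_field, bytes_field):
            continue
        key, sent = fields[key_field], fields[bytes_field]
        ext = key.rsplit(".", 1)[-1].lower() if "." in key else "(none)"
        if sent.isdigit():
            totals[ext] += int(sent)
    return dict(totals)
```

Sorting the result by total bytes points straight at the content type driving the bandwidth bill.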

Since my log files are so voluminous it actually took several days to cull enough information from the logs to get a good picture of just what was happening.  Even though I'd gotten over the initial shock of the increased Amazon bill, I was a bit shocked again when I saw the numbers.  Essentially, overnight, the daily bandwidth from the bucket serving my images (thumbs & stills) went up 3-4 times.  This told me either a) we were serving more images daily (but other indicators didn't point to this), or b) our images were larger starting that day than they had been previously.

Well, that's exactly what it was - the images were bigger.  Turned out a developer, trying to overcome one bottleneck, stopped "optimizing" our images and the average file size grew by nearly 4 times - just about the amount of bandwidth increase I'd found.  So, in order to save a few minutes per day and even a couple bucks with our encoder this one thing ended up costing us thousands of dollars.

Once I identified the issue I began working with our developers to fast-track a solution that accomplishes all our goals: save time and money encoding/optimizing the images, and get them as small as possible to save on Amazon bandwidth, therefore saving a lot of money.  I also went to work identifying the images that were too big and optimizing them.  In fact, this is an ongoing issue as our dev team hasn't quite implemented their new image optimization deployment, so I actually grab unoptimized (i.e. too big) images a few times a day, optimize them, then upload them to S3.  This process probably justifies its own post so I'll do that soon & link to it from here.

Delete Files by Date With DOS Batch File

I have a server that is continuously filling its drive with temporary files left over from a video encoding process we use.  Every week or so I have to manually delete the old files to prevent the drive from filling.  Finally, today I set out to find a way to programmatically clean up these old files regularly.  So I tooled around the Internet for a way to do it from the DOS command line.  Unfortunately, others had run into the same issue I had: DOS commands like del don't have a way to delete by date.  I found a few approaches using complex batch files & even tried one for a few minutes, but when I couldn't get it to work I went back to the drawing board.

I found a really old post suggesting using xcopy to copy the newest files (presumably the ones to keep) to a temp location, then delete the originals, then move the copies back to their original location.  This approach had some promise, but some real drawbacks too; particularly that I'm dealing with tens of gigabytes, so this would take forever, and with several, varying subdirectories.

Since xcopy has been deprecated and replaced with robocopy in Windows 2008, Windows 7, etc. that's what I chose to use.  In fact, robocopy has a couple switches that make it easy to move (/MOV) files older than x days (/MINAGE:x).

What I ended up with was a simple two line batch file that I'm using Windows task scheduler to run once a day.  The first line moves files older than x days to a temporary location & the second line deletes the moved files.
robocopy D:\Original_Location D:\DelOldFiles * /S /MOV /MINAGE:7
del D:\DelOldFiles\* /s /q
Breakdown
The robocopy syntax is ROBOCOPY source destination [file [file]...] [options].  I'm using the following switches (options):
  • /S - include subdirectories
  • /MOV - move (which is technically a copy, then delete original)
  • /MINAGE:x - act on files older than x days
After the files are moved I'm deleting all files (*), in all subdirectories (/s) in my temp folder, quietly (/q), i.e. don't prompt, just do it.
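For comparison, the same age-based cleanup can be sketched in Python, skipping the intermediate move and deleting in place (test this against scratch data before pointing it at anything real):

```python
import os
import time

def delete_older_than(root, days):
    # Delete files under `root` (recursively) whose modification time
    # is more than `days` days old -- the same effect as the
    # robocopy /MINAGE + del pair, without the temp location.
    cutoff = time.time() - days * 86400
    removed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) < cutoff:
                os.remove(path)
                removed.append(path)
    return removed
```

Scheduled daily (Task Scheduler or cron), this keeps the drive from filling just like the two-line batch file.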

See also
Using dos to delete based on date

Thursday, October 27, 2011

ASA 5500 SSL VPN Add Licenses to ASA

I recently had to enable some of my mobile Mac clients with the Cisco AnyConnect VPN Client for Mac.  Of course, since the ASA only included 2 SSL VPN licenses, and that's what the AnyConnect VPN Client uses, I had to purchase some additional licenses.  I purchased the licenses through a reseller & a couple days later they sent me a PDF listing the product (L-ASA-SSL-10= ASA 5500 SSL VPN 10 Premium User License) and a Product Authorization Key.

First, go to the Cisco Product Registration Page and login with your TAC credentials.  In the Product Authorization Key (PAK) field enter the Product Authorization Key from your PDF then click submit.

Next, follow the prompts and agree to their end user license agreement.  You will have to provide the ASA's serial number which can be obtained from the chassis or via show version from the CLI (this is probably the best method as you can copy the S/N from the CLI, then paste it to the authorization screen).

Now wait.

After submitting the required information and verifying other info you'll see a message indicating that you'll have to wait up to one hour for the licensing email.  You'd think Cisco would be able to provide this info right away.  Guess not.

You'll be presented with the following helpful message to read while you wait...
Your license and user information will be sent via email within 1 hour to the email address you specified. If you have not received an email within 1 hour, please open a Service Request using the TAC Service Request Tool. Please have your valid Cisco.com user Id and password available. As an alternative, you may also call our main Technical Assistance Center at 800-553-2447.
Please be sure to check your Junk/Spam email folders for this email from licensing@cisco.com with your license key attached.
Fortunately, only a few minutes later I received the email with the ASA activation key (which is 77 characters long) and the following instructions.

Installing Your Cisco Adaptive Security Appliance Activation Key
Step 1.  From the command line interface (CLI), enter configuration mode using the "conf t" command.
Step 2.  Type the "activation-key" command, and then, when prompted, enter the new activation key listed above.
Which I promptly followed.  Now I have 10 licenses with which to connect my clients.  This, by the way, is a bit of a disappointment as I already had two.  I would have hoped Cisco would preserve the two gratis WebVPN licenses and add my 10 new ones.  No such luck.

Friday, October 21, 2011

Cisco AnyConnect VPN Client for Mac

Recently some of our mobile users needed to connect to one of our networks that's protected by a pair of Cisco ASA firewalls.  It was no problem for the Windows users as I already had what I needed in place, however it was a different story for our Mac users.  Since it had been a while since I setup the ASA for AnyConnect for Windows I'd forgotten everything that was needed so I ran into a little trouble.

First, I downloaded the latest AnyConnect VPN client for Macs from Cisco (anyconnect-macosx-i386-2.5.3055-k9.dmg at the time of this writing), and installed it on a MacBook Pro.

Notes:
  • Of course, you'll have to have a valid SmartNet agreement and account with Cisco to access these files.
  • And, since the Cisco VPN client only runs on 32-bit Macs, AnyConnect is the only option for 64-bit Macs.
With the AnyConnect VPN Client installed on the Mac I launched it and tried to connect to my ASA.  Here's where I ran into my first problem, receiving the message, "The AnyConnect package on the secure gateway could not be located. You may be experiencing network connectivity issues. Please try connecting again."


After a little research I realized I needed to upload the accompanying package (.pkg) file to the ASA.  So I headed back to Cisco to download the package file (anyconnect-macosx-i386-2.5.3055-k9.pkg - must match the version of the AnyConnect VPN Client on the Mac).

With that in hand I copied it to the ASA via TFTP, after, of course, dusting off my (FREE!) SolarWinds TFTP Server I haven't used for quite some time.  Here's the (Cisco) IOS command to copy the file via the terminal:
copy tftp:anyconnect-macosx-i386-2.5.3055-k9.pkg disk0:
Of course you'll have to provide the name/IP address of your TFTP server, which you'll conveniently be prompted for.

With that in place I tried again to connect.  However, I had the same problem, again receiving the message, "The AnyConnect package on the secure gateway could not be located. You may be experiencing network connectivity issues. Please try connecting again."  WTF?

Oh, yeah, I had to register the Mac AnyConnect package with the ASA's IOS.  Since I already have the Windows AnyConnect package registered as #1, and since most who connect to my ASA are Windows clients I left that in the first position and registered the Mac package second with the following commands:
config terminal
webvpn
svc image disk0:/anyconnect-macosx-i386-2.5.3055-k9.pkg 2
Then, by running show webvpn svc I can see that both the Windows and Mac AnyConnect packages are registered with my ASA.  


And I can successfully connect my Mac clients.  Booyah!!!

Need help adding SSL VPN licenses to your ASA 5500?

Monday, October 10, 2011

Expanding a Virtual Disk on a Dell MD 3000i SAN - How To

If you're like me you don't allocate all disks to a SAN out of the gate.  I like to keep a little in reserve so I can add capacity when needed.  Then, once all the disks are added I'll usually pick up a couple more disks and keep those in reserve.  In either case, when it comes time to add capacity to a virtual disk on a Dell MD 3000i SAN it can be a little tricky.

This is accomplished in two steps.  First, add the capacity of one or more physical disks to a disk group.  Next, expand the virtual disk.  The first step is rather easy and done through the Modular Disk Storage Manager utility.  Step two is a little tricky as it uses the Dell smcli command-line utility.

Step 1 - add one or more drives to a Disk Group
  • Open the Dell Modular Disk Storage Manager utility
  • Click the Modify tab
  • Under the Storage subsection click Add Free Capacity (Physical Disks)
  • Select your disk group, click next
  • Select the capacity/number of disks, click finish
Now if you go back to the Summary tab and click Disk Groups & Virtual Disks you can see that you have free space available.


NOTE: This step can take some time to complete. Depending on the size and type of RAID you are running, it may take several hours or more (even days!) to complete. It will not take the disk group down, but may slow things a bit.  Also, this MUST complete before you can perform the next step!  If you jump the gun and run step two prematurely you'll receive the message, "Error 11 - The operation cannot complete because a virtual disk is performing a modification operation..."




Step 2 - Expanding a Virtual Disk
  • Decide how much space to add, in bytes.  You can use an online bit/byte calculator for the conversion.
  • On the computer running Dell Modular Disk Storage Manger, open a command prompt.
  • Navigate to Program Files\Dell\MD Storage Manager\client OR Program Files (x86)\Dell\MD Storage Manager\client if you are on a 64 bit machine.
  • Use the smcli command to expand the disk. Examples below.
smcli Syntax: smcli -n <array_name> -c "set virtualDisk [\"<virtual_disk_name>\"] addCapacity=<capacity_in_bytes>;"

Example smcli command - assumes the following:
  • MD3000i named SAN1
  • Virtual Disk named Data1
  • Want to add 500GB to virtual disk
smcli -n SAN1 -c "set virtualdisk [\"Data1\"] addCapacity=536870912000;"
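The addCapacity value is just the desired size converted to bytes.  Using binary gigabytes (GiB), 500 GB works out to the number in the example:

```python
def gb_to_bytes(gb):
    # smcli's addCapacity parameter takes plain bytes;
    # 1 GiB = 1024 * 1024 * 1024 bytes.
    return gb * 1024 ** 3

print(gb_to_bytes(500))  # 536870912000
```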
While this expands the capacity of the SAN virtual disk the operating system may not automatically recognize it.  On Windows 2008 server perform the following:
  • Open Server Manager 
  • Go to Storage, Disk Management
  • Right-click the desired Windows volume and select Extend Volume
  • Follow the prompts in the Extend Volume Wizard

Thursday, October 6, 2011

Cisco ASA ASDM Install and Download

A few months ago I got a new work computer.  Since it was a good opportunity to start fresh I didn't transfer all the programs from my old one to the new.  And one of them I neglected to transfer was ASDM.  I wasn't too worried about it since I often manage my ASA firewalls via the terminal using PuTTY.  But I had a need for ASDM recently so I downloaded it and ran it from my computer, but to my chagrin I received the message, "Unable to launch device manager from..."  Crap!

I couldn't remember exactly how to download ASDM from my ASA, and it took me a bit to figure out.  Since I wanted to upgrade to the latest version anyway, I thought I'd write myself a reminder here so in a year or two when I get a new computer I don't have to go through this same trouble again.  I hope others find it useful as well.

First, download the latest ASDM bin file from Cisco (you'll have to have a valid SmartNet contract to access the downloads section).  At the time of this writing the latest version is asdm-645.bin.

With that downloaded, use something like TFTP to copy the file to the ASA (the copy command runs from privileged EXEC mode; no config terminal needed).
copy tftp: disk0:/asdm-645.bin
Next, register the ASDM bin with the ASA.
config terminal
asdm image flash:asdm-645.bin
NOTE: the ASDM version needs to be compatible with the IOS on the ASA.

Finally, access the ASA's admin interface at https://<LAN_interface_IP>/admin.  NOTE: this has to be on the LAN interface, either from a computer inside the ASA's network, or, for an external computer, connect via VPN first, then access the ASA's LAN interface.


Click the link, "Install ASDM Launcher and Run ASDM."  Follow the steps to install and connect to your ASA.

Wednesday, October 5, 2011

Steve Jobs 1955-2011

I'm no fan of Apple, but as a techie I have a profound respect for Steve Jobs and everything he's done for my industry, for computing in general and for consumer electronics.  Well done Steve!


Monday, October 3, 2011

Amazon ELB & IIS - Capturing Client IP Address

I've been using Amazon EC2's Elastic Load Balancer (ELB) for a couple years now to load balance web applications, and for the most part it's been great.  The one drawback I've run into is that IIS logs the load balancer's private IP address as the c-ip address, rather than the client's actual IP address.  Essentially the ELB acts like a NAT device.  This can be a problem when trying to troubleshoot requests to your IIS sites.  And it's just plain annoying.

So I finally did a little digging on this and found a simple and elegant solution: have IIS log the IP address value of the X-Forwarded-For request header, which ELB populates with the client IP address when it forwards the request to IIS.
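Conceptually, the filter substitutes the header value for the peer address the way this sketch does; with a single proxy in front (the ELB), the leftmost X-Forwarded-For entry is the originating client:

```python
def effective_client_ip(headers, peer_ip):
    # Behind an ELB, peer_ip is the load balancer's private address.
    # X-Forwarded-For holds "client[, proxy1, proxy2, ...]"; the
    # leftmost entry is the original client.
    xff = headers.get("X-Forwarded-For", "")
    if xff:
        return xff.split(",")[0].strip()
    return peer_ip
```

Note that X-Forwarded-For is client-suppliable in general, but when the only path to IIS is through the ELB, the value it forwards is trustworthy.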

Start by downloading the X-Forwarded-For ISAPI filter from F5 and extracting the files.  There's a lot here, including source code, but all you need is the appropriate F5XForwardedFor.dll, either x86 (32 bit) or x64 (64 bit).  To make it easy I copied mine under C:\inetpub, i.e. C:\inetpub\F5XForwardedFor2008\x64.

Next, open IIS Manager, highlighting the server name in the Connections pane.  In the <servername> Home pane double-click ISAPI Filters.  Then in the Actions pane (upper-right corner) select Add.  Give the filter a name (I used Xforward) and specify the exact location of the Executable (F5XForwardedFor.dll).

NOTE: by adding this at the server level it will apply to all sites on the server.


Click OK and you're done.  Now, sit back, relax and wait for your server logs to accumulate.  Here's a view of an IIS log after enabling the F5XForwardedFor ISAPI filter.


NOTE: After installing this ISAPI filter I did notice a slight CPU load increase on my IIS servers, around 1-2% more.  Basically my servers average between 10% - 30% under normal load; now they average about 12% - 32%.  Not much, but noticeable, and in my opinion well worth the cost.

Thursday, August 11, 2011

Copying an EBS-backed Windows 2008 AMI Between AWS Regions - How-to

Unfortunately Amazon doesn't have an easy or native way to copy, move, or launch an AWS AMI from one region to another.  There are a number of posts on the Internet about how to do this with a Linux AMI, but I haven't been able to find clear instructions on how to do it for a Windows AMI.  So, here we go...

The basic steps involve starting a temporary Windows instance in each of the two regions, say us-east-1 and us-west-1; attaching the EBS boot volume to the server in the "source" region; making an image or zip file of the volume; copying the volume to the temporary server in the "destination" region; extracting the file to a new volume; finally, attaching that volume to a server.

Notes:
  • This will take a while (several hours), particularly if your AMI is large.
  • Clean up the boot volume of the image you want to copy to another region by deleting any unnecessary files, such as temporary files, etc. as it will reduce the overall time this process takes.
  1. Launch a temporary Windows instance in the “source” region and another one in the “target” region. (You may not have to launch temporary servers if you have one available in each that can handle the load of compressing a large volume which can be quite CPU intensive.  Additionally you need to have enough available disk space for the compressed volume to be stored temporarily.)
  2. Determine the snapshot for the boot volume you want to migrate (you must own the AMI) using the command ec2-describe-images.


  3. Create a new EBS volume from that snapshot in the same zone as your origin server.  This can be done many ways, but here's how to do it from the command line using the AWS tools:
    ec2-create-volume --snapshot <snap ID> -z us-east-1c
  4. Attach that volume to your temporary server instance as, say, “xvdg”:
    ec2-attach-volume <volumeID> -i <instanceId> -d xvdg
     NOTE: You will be able to browse the contents of the volume in Windows Explorer.

  5. Connect to your temporary source server with RDP.
  6. Zip the entire contents of the newly attached volume.  NOTE: I used 7zip and sent the zipped file to another volume on the temporary source server in 1GB chunks (this makes it easier and quicker to transfer to the destination server; in particular you can begin copying the chunks soon as each is finished rather than waiting for one large file.)
  7. Copy that zipped file (or files) to your instance in the “target” region.  This could be done by a variety of methods.  I chose to copy my 1GB chunks to an Amazon S3 bucket, then I could easily download those onto the destination server.

    In the target region:

  8. Create an EBS volume of appropriate size (30GB for Windows 2008 by default) and attach it to your temporary destination server.
  9. Unzip the file to the new volume.  Again, I used 7zip.
  10. Detach the volume.

    Now, for the Windows specific stuff….

  11. Launch a basic Windows 2008 instance of the right architecture (32 or 64 bit).
  12. As soon as the instance is "running" (see Pinging Amazon EC2 Instances to determine exactly when your instance is available), stop it and wait for its state to become "stopped".
  13. Once it stops, detach its “/dev/sda1” volume and delete it using ec2 commands:
    ec2-detach-volume <the_volumeID_of_sda1> -i <new_windows_instance>
    ec2-delete-volume <the_volumeID_of_sda1>

  14. Now attach the new volume (from steps 8-10) to the stopped Windows instance as ‘/dev/sda1’:
    ec2-attach-volume <vol_id> -i <windows_instance_id> -d /dev/sda1
  15. Start the instance to make sure it boots, and connect to it with RDP.
  16. When you're satisfied that it boots and is set up the way you desire, stop the instance using ec2stop -i <instanceID>.
  17. Finally, create an AMI from that server by running ec2-create-image -i <instanceID>.
That's it.  Now you have an AMI in a different AWS region that is a copy of the one from your original region.
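As a footnote, the chunking from step 6 (done above with 7zip's volume splitting) can be sketched generically.  Splitting the archive into fixed-size pieces is what lets you start copying finished chunks to S3 while later ones are still being written:

```python
def split_file(path, chunk_bytes=1024 ** 3):
    # Write path.000, path.001, ... each at most chunk_bytes long,
    # and return the list of part names in order.
    parts = []
    with open(path, "rb") as src:
        index = 0
        while True:
            data = src.read(chunk_bytes)
            if not data:
                break
            part = "%s.%03d" % (path, index)
            with open(part, "wb") as dst:
                dst.write(data)
            parts.append(part)
            index += 1
    return parts
```

Reassembly on the destination server is just concatenating the parts back in order (or letting 7zip do it, as in step 9).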