Compiling FreeTDS on Windows

This is how I got FreeTDS to compile on Windows 7 Professional with Visual Studio 2008.

1. Install Visual Studio 2008 (a.k.a. Visual Studio 9.0)
2. Download the FreeTDS 0.91 source tarball and extract it somewhere. I chose C:\Users\marca\src\freetds-0.91.
3. Open a Visual Studio 2008 Command Prompt.
4. Change to the directory with FreeTDS:

cd C:\Users\marca\src\freetds-0.91

(or whatever you chose)
5. Give this a shot; it will probably fail:

nmake -fNmakefile -nologo apps PLATFORM=win32 CONFIGURATION=debug

6. You will get a bunch of errors that look like this:

c:\users\marca\src\freetds-0.91\include\tds_sysdep_private.h(239) :
fatal error C1083: Cannot open include file: 'inttypes.h':
No such file or directory

7. Fix the inttypes.h error — edit win32/config.h and comment out this line (line 92 for me):

#define HAVE_INTTYPES_H 1

8. If you repeat the above nmake command, then you’ll get a new error:

c:\program files (x86)\microsoft visual studio 9.0\vc\include\stdio.h
(358) : error C3163: '_vsnprintf': attributes inconsistent 
with previous declaration
        c:\program files (x86)\microsoft visual studio 9.0\vc\
        include\stdio.h(350) : see declaration of '_vsnprintf'

9. Fix the _vsnprintf error by editing include/tds_sysdep_private.h and commenting out the second instance of this line (line 96 for me):

#define vsnprintf _vsnprintf

10. Repeat the nmake command. This time it should succeed, and you’ll end up with a tsql.exe file:

C:\Users\marca\src\freetds-0.91>nmake ^
More? -fNmakefile -nologo apps ^
More? PLATFORM=win32 CONFIGURATION=debug
...
C:\Users\marca\src\freetds-0.91>dir src\apps\win32\debug\tsql.exe
 Volume in drive C has no label.
 Volume Serial Number is D0C5-FA7C

 Directory of C:\Users\marca\src\freetds-0.91\src\apps\win32\debug

07/18/2013  04:19 PM           847,872 tsql.exe
               1 File(s)        847,872 bytes
               0 Dir(s)  19,728,400,384 bytes free

11. Give it a spin:

C:\Users\marca\src\freetds-0.91>src\apps\win32\debug\tsql ^
More? -H <server> -p 1433 ^
More? -U <username> -P <password> -D <database>
...
1> SELECT @@version
2> GO

Microsoft SQL Server 2008 R2 (SP2) - 10.50.4279.0 (X64)
        Mar 26 2013 17:33:13
        Copyright (c) Microsoft Corporation
        Developer Edition (64-bit) on Windows NT 6.1  
       (Build 7601: Service Pack 1) (Hypervisor)

(1 row affected)
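
Not part of the build itself, but FreeTDS is most useful as the backend for database libraries. As a further smoke test, here is the same query from Python via pymssql, which uses FreeTDS under the hood; a sketch assuming a pymssql 2.x build linked against FreeTDS, with placeholder credentials:

# Same smoke test via pymssql (uses FreeTDS under the hood).
# Assumes pymssql 2.x linked against FreeTDS; credentials are placeholders.
import pymssql

conn = pymssql.connect(server='<server>', port=1433,
                       user='<username>', password='<password>',
                       database='<database>')
cur = conn.cursor()
cur.execute('SELECT @@version')
print(cur.fetchone()[0])
conn.close()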

Plone and Zope time

Doing my yearly ritual of looking at Plone & Zope and making sure I still don’t grok them. Check.

In this year’s installment, I set up a little Plone site and actually got as far as creating a custom content type. I started down the road of thinking I would create the content type manually with code in Archetypes. Then I looked at tutorials and saw that it involved a lot of repetitive code and configuration voodoo. I ended up using ZopeSkel to generate a really, really simple Archetypes content type, which actually didn’t take too long to figure out (to my pleasant surprise).

It still feels like too much work, and it’s difficult to find good documentation on the current recommended way to do things, because the recommended way seems to change every couple of years and there is a lot of old, outdated documentation out there. A search on Archetypes, for instance, yielded all kinds of pages referring to practices that now seem outdated, like manually creating the directory structure for an Archetypes product (i.e.: no ZopeSkel) and copying the product into the Zope 2 “Products” directory (which seems to be deprecated in favor of buildout).

In comparison, creating a model in Django is very simple, Pythonic, and well-documented.
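
To illustrate the comparison, here is roughly what a minimal content type looks like as a Django model; a generic sketch, not taken from a real project:

# A minimal Django content type: just a class with some fields.
from django.db import models

class Article(models.Model):
    title = models.CharField(max_length=200)
    body = models.TextField(blank=True)
    created = models.DateTimeField(auto_now_add=True)

    def __unicode__(self):  # Python 2 / Django 1.x era
        return self.title

No skeleton generator and no separate configuration files; the schema and storage follow from this one class.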

Carbonite woes

I’ve been using Carbonite for a while and it’s worked well most of this time (though how do you know if a backup tool is working?), but recently I’ve started to notice a few issues with it, and I may end up switching to another cloud backup service. I’m currently evaluating CrashPlan, and another part of me is seriously considering going old-school and using good ol’ rsync, or maybe just cloning to a bunch of spare hard drives and leaving them with friends and family. Some of these issues might not be serious, but when it comes to a backup tool, it makes me nervous when I see weird, unexplainable behavior. I like my backup software to just work. That is why I’m detailing everything weird I saw, even if some of it is the inevitable stuff you get when you peek too closely into the sausage factory.

Issue 1: Backups can stop progressing; no alerts to tell me this

A couple of weeks ago, I noticed that Carbonite had been “stuck” – there were about 1500 files that were pending backup and that number was not shrinking.

I happened to notice this because we had taken a lot of photos and I decided to check whether they were getting backed up. If I had not checked and our hard drive crashed, that could’ve been really bad.

This stuck backup led to me spending quite a few hours measuring, diagnosing, repairing the disk, and tweaking, and it looks like things are working better now (only 42 files are pending backup as I write this). It also led to the discovery of the other issues detailed below. But the main point I want to make here is that I should not have to notice that backups have stalled. Carbonite should detect that it is not progressing and send me an alert email and/or pop up an alert on my screen.
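
In the meantime, a watchdog on my end would not be hard. Here is a rough sketch (mine, not Carbonite’s) that assumes the “Pending: N (…)” progress lines shown under Issue 3 below keep appearing in Carbonite.log:

# Rough stopgap: warn when Carbonite's pending-file count stops shrinking.
# Assumes the "Pending: N (...)" progress lines shown under Issue 3.
import re
import time

LOG = "/Library/Application Support/Carbonite/Data/Carbonite.log"
PENDING_RE = re.compile(r"Pending: (\d+) \(")

def last_pending_count():
    count = None
    with open(LOG) as f:
        for line in f:
            m = PENDING_RE.search(line)
            if m:
                count = int(m.group(1))
    return count

previous = last_pending_count()
while True:
    time.sleep(60 * 60)  # check hourly
    current = last_pending_count()
    if previous is not None and current is not None and current >= previous:
        print("WARNING: pending backups not shrinking: %d -> %d"
              % (previous, current))
    previous = current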

Issue 2: Carbonite not backing up because CPU Idle is too low (and Carbonite is using the bulk of the CPU)

This is possibly one of the big reasons for Issue 1.

When my backup wasn’t progressing, I looked for clues in /Library/Application Support/Carbonite/Data/Carbonite.log and noticed that CPU idle was almost never at Carbonite’s target of 75%. I wonder if that is what is keeping Carbonite from backing up. I tried quitting every program on my computer (including all sorts of menu bar apps and background apps such as Cloud.app, Dropbox, Evernote, Jenkins, etc.). The only thing using a lot of CPU, according to Activity Monitor, was Carbonite itself.

At this point, I started investigating what Carbonite was looking at, using the fs_usage command.

Ideally, I shouldn’t have to quit every program on my computer just to get CPU usage low enough for Carbonite to do backups. The beauty of a cloud backup app like Carbonite is that it should just do its thing in the background without me having to change my workflow. When it’s working well, that is exactly what it does; I’m not sure why it has been stumbling lately.
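
For the record, confirming which processes are dragging CPU idle below the 75% target is easy to do outside of Activity Monitor too; a sketch using the third-party psutil package (2.x or later API):

# List the top CPU consumers (the processes dragging CPU idle down).
# Requires the third-party psutil package (2.x+ API).
import time
import psutil

# Prime the per-process CPU counters, then sample over one second.
for p in psutil.process_iter():
    try:
        p.cpu_percent(interval=None)
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass
time.sleep(1)

samples = []
for p in psutil.process_iter():
    try:
        samples.append((p.cpu_percent(interval=None), p.name()))
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

for cpu, name in sorted(samples, reverse=True)[:10]:
    print("%5.1f%%  %s" % (cpu, name))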

Issue 3: Internal program errors in the logs

I also noticed some internal errors in the log and I wonder if Carbonite is crashing when it tries to back up certain files and thus is unable to make progress – e.g.:

1346811842  = 19:24:02
1346811842  @ 2954915840: Backup progress: completed 38%, file counts last 2 22:10:34 All: 3617 (1081M bytes) Unique: 3225 (731M bytes), Compressed: 848M bytes.
1346811842  @ 2954915840: Backup progress: remaining time unknown, file counts All: 563085 (217G bytes), Pending: 1396 (1767M bytes).
1346811842  @ 2954915840: Backup progress: task seconds last minute, Active: 0 (0%) Response Waiting: 61 (100%) Send Waiting: 0 (0%) Sending: 61 (100%) 
1346811873  ! 2957045760: Internal program error: ASSERT "LogFile.SetFilePointer(dwBufferPosition)" (/Library/Bamboo/home/xml-data/build-dir/MAC-CLI-JOB1/Client/Mac/daemon/../../../Shared/Common/LogMsg.cpp:735)
1346811873  @ 2957045760: DumpStack: stack trace not available
1346811873  ! 2957045760: Internal program error: ASSERT "LogFile.SetFilePointer(dwBufferPosition)" (/Library/Bamboo/home/xml-data/build-dir/MAC-CLI-JOB1/Client/Mac/daemon/../../../Shared/Common/LogMsg.cpp:1255)
1346811903  = 19:25:03

This does not inspire confidence.

Issue 4: Delaying backup due to zero length file modified within the last minute

Carbonite says that it’s “delaying backup due to zero length file modified within the last minute” – the issue is that these files are zero-length but they have not been modified within the last minute. For example:

1347897240  # 2956513280: Delaying backup due to zero length file modified within the last minute: "/Volumes/Momentus/Users/marc/Library/Preferences/MaxBack .mxbk".
1347897240  # 2956513280: Delaying backup due to zero length file modified within the last minute: "/Volumes/Momentus/Users/marc/Dropbox/Camera Uploads/Icon ".
1347897240  # 2956513280: Delaying backup due to zero length file modified within the last minute: "/Volumes/Momentus/Users/marc/Library/Application Support/Evernote/accounts/Evernote/msabramo/note.index/write.lock".
1347897240  # 2956513280: Delaying backup due to zero length file modified within the last minute: "/Volumes/Momentus/Users/marc/Library/Application Support/Google/Chrome/Default/Extension State/LOCK".

But the apps that write these files are not running and these files have not been modified recently at all.

$ date
Mon Sep 17 09:19:25 PDT 2012

$ ls -l "/Volumes/Momentus/Users/marc/Library/Preferences/MaxBack .mxbk"
-rw-r--r--  1 marc  marc  0 Sep 11 02:40 /Volumes/Momentus/Users/marc/Library/Preferences/MaxBack .mxbk

$ ls -l "/Volumes/Momentus/Users/marc/Dropbox/Camera Uploads/Icon^M" 
-rw-r--r--@ 1 marc  marc  0 Sep 10 23:28 /Volumes/Momentus/Users/marc/Dropbox/Camera Uploads/Icon?

$ ls -l "/Volumes/Momentus/Users/marc/Library/Application Support/Evernote/accounts/Evernote/msabramo/note.index/write.lock"
-rw-------  1 marc  marc  0 Sep 16 22:23 /Volumes/Momentus/Users/marc/Library/Application Support/Evernote/accounts/Evernote/msabramo/note.index/write.lock

$ ls -l "/Volumes/Momentus/Users/marc/Library/Application Support/Google/Chrome/Default/Extension State/LOCK"
-rw-------@ 1 marc  marc  0 Sep 10 06:07 /Volumes/Momentus/Users/marc/Library/Application Support/Google/Chrome/Default/Extension State/LOCK

This may or may not be affecting my backup appreciably. But I’ve already had difficulty getting Carbonite to do its backups, so when I look through the logs and notice other things that seem incorrect, it further erodes my confidence in the product.
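
For what it’s worth, the check Carbonite claims to be making is easy to state precisely. Here is a sketch of it in Python, using one of the paths from the log above:

# Sanity-check Carbonite's claim: is the file really zero-length AND
# modified within the last minute?
import os
import time

def zero_length_and_recent(path, window=60):
    st = os.stat(path)
    return st.st_size == 0 and (time.time() - st.st_mtime) < window

path = ("/Volumes/Momentus/Users/marc/Library/Application Support/"
        "Google/Chrome/Default/Extension State/LOCK")
print("%s: %s" % (path, zero_length_and_recent(path)))
# The ls output above shows this file is zero-length but was last
# modified days ago, so the "recently modified" half should be False.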

Issue 5: Carbonite wasting its time repeatedly checking files and directories that have long been deleted

Carbonite doesn’t log every single file it looks at (Thank goodness! That would be extremely inefficient.), so most folks wouldn’t even notice this one. I only did because my backup was not progressing and I kept seeing messages in the logs saying that Carbonite was waiting because CPU usage was too high, and Carbonite was the only app using a lot of CPU, so I went poking around to see what it was so busy doing.

$ sudo fs_usage | grep '/.*CarboniteDae'

08:33:34  getattrlist       //Users/marc/etsylister/parts>>>>>>>>>>>>>>>                                     0.000038   CarboniteDae
08:33:34  getattrlist       //Users/marc/etsylister/parts>>>>>>>>>>>>>>>                                     0.000007   CarboniteDae
08:33:34  getattrlist       //Users/marc/etsylister/parts>>>>>>>>>>>>>>>                                     0.000008   CarboniteDae
...

I cut this off at 3 entries, but in reality, I saw this 44 times in one second (and it happened for more than one second, but I stopped counting). This also isn’t the only directory that it’s continuing to check – there are several others.
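
To put numbers on it, you can pipe fs_usage into a small counter. A sketch that assumes the output format shown above, with the path in the third column (paths containing spaces would need smarter parsing):

# Count how often CarboniteDaemon touches each path. Usage:
#   sudo fs_usage | grep CarboniteDae | python count_accesses.py
import sys
from collections import defaultdict

counts = defaultdict(int)
try:
    for line in sys.stdin:
        parts = line.split()
        if len(parts) >= 3:
            counts[parts[2]] += 1  # path is the third column (see above)
except KeyboardInterrupt:
    pass

for path, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True)[:20]:
    print("%6d  %s" % (n, path))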

The fact that an app is repeatedly accessing the same file 44 times per second is concerning in itself. To add fuel to the fire, this directory no longer exists! See, I had noticed this message many times in the past and thought I could give Carbonite a hand and get it past its compulsive checking by deleting this directory and a bunch of other ones it kept checking. So maybe a week ago, I went on a rampage and blew away a bunch of directories that Carbonite was obsessing over and that weren’t important to me. Problem is, Carbonite is still checking them! And this is even after I uninstalled Carbonite, blew away all its preferences and support files, rebooted, and reinstalled it.

Either I missed deleting the file on my local filesystem that carries this state, or, more likely, the knowledge of these files lives on Carbonite’s servers. Perhaps after some amount of time or some number of failed attempts to access a file (how many?), Carbonite will forget about it. But in the meantime, I worry that it is wasting its time checking stuff that is gone when it could be backing up the files that I do have.

Update 2012-09-27

I’ve accumulated the info in this post over the course of the last several weeks. At one point, I did a bunch of fiddling, including a disk repair on my startup disk, and managed to get Carbonite to make enough progress that only 70 or so files were pending backup (it’s quite possible that disk errors were confusing Carbonite). Still, I wanted them all backed up, or at least to know which files weren’t being backed up and why. The latest is that I talked to Carbonite support; a fairly nice and helpful guy connected to my machine remotely, made Carbonite exclude some supposedly problematic files (browser caches and the like), and messed around with some Carbonite caches. This didn’t really make Carbonite catch up and back up the remaining 70 files. In fact, over the course of the last few days that number has grown to a little over 300. This suggests that Carbonite is having trouble with some of these files, is “stuck”, and will likely fall further and further behind. I don’t know which files are the problematic ones, and Carbonite support hasn’t yet been able to tell me either.

Most likely, I will let my Carbonite subscription lapse when it comes up for renewal. I might go with CrashPlan, but I think it’s more likely that I will forgo cloud backup services entirely and instead just clone my stuff to hard drives and store a couple in offsite locations like the homes of friends and family. The knocks against cloud backup:

  • Hard to know whether it’s working – how do I really know which files the provider has and which they don’t?
  • Cloud backup uses a ton of my computer’s CPU and network bandwidth for months and months. When I do backups myself, I kick them off and after a few hours they’re done.
  • When the time comes to restore files, I will have to download a tremendous amount of data or pay over a hundred dollars for them to ship me a hard drive. Even then, the shipped drive or downloaded files will just be files, not a bootable image. Conversely, if I go to a friend’s house and pick up a hard drive with a clone of my system, I have all of my files immediately and something I can boot from.
  • I could save the money on the subscription fee and use it towards my hard drive budget.

My custom Alfred.app searches

I thought others might find a couple of my custom searches and fallback searches for Alfred.app interesting. When I type some text that doesn’t match anything, here’s what I get:

[Screenshot: Marc’s Alfred.app custom and fallback searches]

The “Jira search” entry searches my company’s local Jira server. You can add one for yourself by clicking this link and editing the URL in Alfred Preferences | Features | Web & URL | Custom Searches (this is in Alfred.app version 1.3.1).

The “OF Inbox” entry uses an omnifocus:// URL to send the text to your OmniFocus Inbox. Here’s a link to add this to your Alfred config.
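
If memory serves, the URL behind that entry is of the form omnifocus:///add?name={query}, where {query} is Alfred’s placeholder for the typed text; double-check against the link above, since the OmniFocus URL scheme may vary between versions.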

Memeo backup = slow

I replaced a defective Seagate GoFlex portable hard drive a few days ago and the new one that Seagate sent me had Memeo backup software on it. I decided to give it a whirl.

I got it configured, but judging from how long it appears to take to do a complete backup, I don’t think it will be sticking around long.

22,023 hours remaining?!?!

I already have Carbon Copy Cloner for cloning my whole drive to a USB hard drive and I have Carbonite for doing backups of various important files such as pictures and music to the cloud. That is probably good enough.