October 15, 2011
Regardless of how you feel about app stores, software repositories and package managers, or stealth background updating, there’s no question that the process of discovering software, keeping it up to date, and disposing of it if need be leaves much to be desired on most platforms.
Dependency tracking, for example, is great when it works, but frustrating if not impossible to deal with for mere mortals when it doesn’t. A central my-way-or-the-highway authority, as epitomized by Apple’s App Store1, leads to high quality (except when it doesn’t), but may also keep out programs you would have been interested in. Tools like Adobe ARM, Google Omaha and soon Mozilla’s equivalent constantly take up resources, open potential security issues of their own, and may make decisions against your will (what if you preferred the old version?), yet possibly still in your interest (what if you would have overlooked that security patch?).
A whole book could be written about this mess. It’s a situation where Desktop Linux feels, at least superficially, way ahead, and Windows dead last. To add insult to the injury of having to manually find and update software on Windows, it’s also the OS where you’re most likely to wind up downloading something malicious.
However, that’s not the problem I had. I know, for the most part, which additional software I care about on top of the OS itself and my own stuff. What I don’t necessarily know (and certainly don’t care to keep track of) is whether each of those apps is installed and current on all the machines I happen to be responsible for. That problem is irrelevant for most people — why would you have more than one machine, or at most two (a desktop and a laptop)?
But perhaps you, like me, also take care of other machines — in my case, colleagues’ workstations, in-house servers, and servers at a remote data center. Plus, of course, all those virtual machines. In that case, you may be quick to say “well, why don’t you just use Active Directory, group policies, and IntelliMirror to deploy the newest stuff?” I could, and I do, but it’s a pain, and it doesn’t really help at all for the fourth category: machines that aren’t local.
And did I mention it’s a pain? It’s not just Adobe who keep reinventing how Windows Installer is supposed to work; Microsoft themselves seem quite confused on the concept, too. Try deploying .NET Framework 4 via group policy, as a pure, single, simple .msi file (batch scripts are cheating). Or try keeping Adobe Flash Player up-to-date. Or, really, close your eyes, spin a giant wheel-o’-apps, point at one at random, and then figure out how to deploy it. This technology came 11 years ago with Windows 2000, and while it may save time in large organizations where granular control of who gets what matters, it simply doesn’t work well on a smaller scale.
Let’s scrap the idea of central deployment and instead get back to considering each machine individually.
What if we could have a Linux-style package manager in Windows? There are a few, actually. One, NuGet, has been generating some buzz lately, but is focused entirely on extending Visual Studio with libraries you want to use. Not what we’re looking for. With Raktajino-Packagemanager, we’re getting warmer; this one, however, is meant to be used for servers. I stumbled upon it because it was pre-installed on a machine from a hosting provider; perhaps it was written by an employee of theirs.
Npackd comes very close. By Linux standards, it’s rather limited: you can set a path prefix for all installed packages, and add additional repository sources. Packages can depend on each other, but there’s no customizing a package before installing it.
I’ve been using this for a week or so on my regular work VM, and have been 1) extending the somewhat limited selection using my own repository, and 2) pondering to what extent I could use this, at the very least, for that fourth category: servers. The default repository already comes with vital tools such as Process Explorer, which I had previously kept current with a batch file; no more need for that now.
A few issues are in the way:
- This may be a question of lacking maturity, but I would have expected its otherwise conceptually sound repository format to handle standard cases like “here’s an http URL pointing at a .msi file — figure out how to install/uninstall it automatically” well. Instead, most packages seem to have some odd wrapper code.
- For some packages like CCleaner, the default install it uses is unacceptable: it installs some Bing Toolbar or whatever. It could come bundled with the world’s best additional software, and I still wouldn’t care, because I didn’t ask for it, it’s from an untrusted third party, and I never explicitly opted in to trusting it. Worse, those defaults are reset with each update. So, for such packages, even though Npackd offers them, it’s absolutely not an option. I suppose it’s possible that modifying the package to disable the option by default would violate the license, which really wouldn’t help CCleaner’s case.2 On a related matter, I’m not sure how to handle an install like WinSSHD’s. What if you have a commercial license with a serial number? Would each upgrade remove it?
- What is touted as “multiple program versions can be installed side-by-side” is really an anti-feature for my needs (your mileage may vary). It works by placing each version in its own directory. I’m not sure how much of an advantage this is in practice, given that it’s no real sandboxing — multiple versions presumably still share the same registry keys and the same user application data. But in my case, it’s a major detriment: I have scripts that access 7-Zip from the command line, hard-coded to find its path. AnkhSVN, a Subversion add-in for Visual Studio, has various hard-coded paths to diff and merge tools, such as WinMerge. Both 7-Zip and WinMerge are provided by Npackd’s default repository, but even if I manually tell the scripts and AnkhSVN where to find them, I cannot rely on Npackd’s update mechanism, because each update will break those paths again. I can see the appeal, but in its current state, I wish the whole mechanism were optional. Better yet, it should allow for stable “point to latest version” links. With WinMerge, I’ve set up a junction point3, which lets AnkhSVN “discover” it once more, but this will break with the next upgrade. I can’t even write a tool that automatically generates such junctions, because the directory structure is not consistent across packages. Setting a PATH is also not much of an option, it appears. This goes back to my point above about the wrapper code; Npackd ought not to leave it up to package maintainers where individual files are placed.
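If the versioned directories did follow one consistent naming scheme, the junction-maintenance tool mentioned above would be straightforward. Here’s a minimal sketch of what I mean, assuming a hypothetical layout like `<prefix>\<package>-<major>.<minor>.<patch>` (which, again, is exactly what Npackd does not guarantee today). The `mklink /J` call is a real cmd.exe built-in; everything else about the paths is illustrative.

```python
import os
import re
import subprocess

def newest_version_dir(prefix, package):
    """Return the highest-versioned directory named like '<package>-1.2.3'
    under prefix. The naming scheme is an assumption -- Npackd's actual
    directory layout varies per package, which is exactly the problem."""
    pattern = re.compile(re.escape(package) + r"-(\d+(?:\.\d+)*)$")
    best = None
    for entry in os.listdir(prefix):
        m = pattern.match(entry)
        if not m:
            continue
        version = tuple(int(p) for p in m.group(1).split("."))
        if best is None or version > best[0]:
            best = (version, os.path.join(prefix, entry))
    return best[1] if best else None

def relink(prefix, package, link_name):
    """Point a junction (e.g. 'WinMerge-current') at the newest version,
    so hard-coded consumers like AnkhSVN survive upgrades. mklink is a
    cmd.exe built-in, so it has to be invoked through cmd (Windows only)."""
    target = newest_version_dir(prefix, package)
    if target is None:
        raise LookupError("no versioned directory found for " + package)
    link = os.path.join(prefix, link_name)
    if os.path.isdir(link):
        os.rmdir(link)  # removes the junction itself, not its target
    subprocess.check_call(["cmd", "/c", "mklink", "/J", link, target])
```

Run after each update (or from a scheduled task), this would keep a stable path like `WinMerge-current` valid — but only under that assumed layout.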
Some minor issues mostly related to batch processing:
- There’s no option to install multiple packages at a time,
- …nor to export a selection of packages and import that to another machine (much less synchronize such lists),
- …nor to update everything that’s out of date, rather than just individual packages,
- …and lastly, aside from a pony, I’d also like a Windows Service to keep packages up-to-date without any user intervention (or admin privileges).
Ignoring the troubling side-by-side feature for a moment, I’d consider Npackd a net gain. There’s a command-line tool I haven’t played with much yet, and it may well be enough to let me accomplish the above four: pick a selection of packages, write a batch script, and push it to multiple servers.
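A thin wrapper around that command-line tool could paper over the missing “install multiple packages” and “import a list on another machine” features. The sketch below assumes `npackdcl` accepts invocations along the lines of `npackdcl add --package=<id>` (check `npackdcl help` against your version); the package IDs and install path are made up for illustration.

```python
import subprocess

# Both the package IDs and the npackdcl syntax are assumptions -- verify
# against `npackdcl help` for your Npackd version before relying on this.
PACKAGES = [
    "org.example.SevenZip",    # hypothetical package ID
    "org.example.WinMerge",    # hypothetical package ID
]

def build_commands(packages,
                   npackdcl=r"C:\Program Files\Npackd\npackdcl.exe"):
    """One 'add' invocation per package; without --version, Npackd is
    assumed to install the newest available version."""
    return [[npackdcl, "add", "--package=" + p] for p in packages]

def run(commands, dry_run=True):
    """Print the commands (dry run) or actually execute them."""
    for cmd in commands:
        if dry_run:
            print(" ".join(cmd))
        else:
            subprocess.check_call(cmd)

if __name__ == "__main__":
    run(build_commands(PACKAGES), dry_run=True)
```

The same package list, kept in version control, could then be replayed on every server — a poor man’s substitute for exporting and synchronizing package selections.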
But one can always dream of more, right?4
I reject the notion that you could possibly trademark this.↖
I get that Piriform has to make money. I just don’t accept the method.↖
Sort of similar to Unix symlinks, though starting with Vista, NTFS also has symlinks.↖
And if it weren’t written in C++/Qt, one could even contribute oneself to make the ‘more’ happen!↖