I discovered that there was a new, free version of MSDE called MS SQL Server 2005 Express. This next-generation version of MSDE supports .Net 2.0 and XQuery. I downloaded and installed the .Net 2.0 beta and the Express beta -- only to find that the whiz-bang client tools do not exist yet. I will await those tools before continuing my evaluation.
... until the collector arrives ...
I migrated our Intranet web site files from VSS to SVN. I wanted to make the 'live' web directory an SVN working copy, but I didn't know how to configure IIS not to serve up the numerous .svn files. I was looking for an equivalent to the Apache 'limit' directives. Apparently, there is no such concept in IIS. However, Microsoft has a global ISAPI filter called UrlScan which can, among other things, control whether files are served up based upon their extension. UrlScan is configured by the file urlscan.ini.
The default UrlScan permissions are fairly tight. In order to avoid breaking any other applications (like ASP.NET or FrontPage extensions), I moved the UrlScan filter down the priority list (on the web site properties\ISAPI Filters page). Note that UrlScan loads as a 'high priority' filter unless you adjust the AllowLateScanning directive in urlscan.ini.
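For reference, here is a sketch of the relevant urlscan.ini settings -- the section and option names below are from memory and should be checked against the UrlScan documentation for your version:

```ini
[Options]
; Load UrlScan after other ISAPI filters rather than first.
AllowLateScanning=1

[DenyUrlSequences]
; Reject any request whose URL contains an .svn path segment.
.svn
```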
I installed the subclipse Eclipse plug-in for Subversion. There were some gotchas:
- The zip file contained what looked to be a standard Eclipse plug-in directory structure but, in fact, the real plug-ins were nested more deeply in the structure. In addition, there were some DLLs to throw in the main Eclipse directory.
- Make sure that you exclude .svn directories from the build copying process (by adding '.svn' to the list of filtered resources on the Preferences/Java/Compiler/Build Path panel). If you don't, subclipse thinks that all of the build product directories are under source control and gets very confused. Also, you cannot do a clean rebuild of the project, because Eclipse fails to delete the read-only files in the .svn directories; you must delete those directories manually.
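The manual cleanup can be scripted. Here is a rough sketch in Java (the class name and the default 'build' path are my own inventions) that removes every .svn directory under a tree, clearing the read-only flag that trips up Eclipse's delete:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class SvnDirCleaner {
    // Delete every .svn directory (and its contents) under root.
    public static void clean(Path root) throws IOException {
        List<Path> svnDirs;
        try (Stream<Path> walk = Files.walk(root)) {
            svnDirs = walk.filter(Files::isDirectory)
                          .filter(p -> p.getFileName().toString().equals(".svn"))
                          .collect(Collectors.toList());
        }
        for (Path dir : svnDirs) {
            try (Stream<Path> walk = Files.walk(dir)) {
                // Delete children before parents.
                for (Path p : walk.sorted(Comparator.reverseOrder())
                                  .collect(Collectors.toList())) {
                    p.toFile().setWritable(true); // clear the read-only bit
                    Files.delete(p);
                }
            }
        }
    }

    public static void main(String[] args) throws IOException {
        clean(Paths.get(args.length > 0 ? args[0] : "build"));
    }
}
```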
With VSS on the fritz, we decided to try out Subversion. I spent the day getting svnserve running, creating a repository, and populating it from t:\java2. We will try SVN for a little while and, if it works, will migrate more projects into it. Ultimately, we hope to retire VSS.
Our VSS repository is done for. It took an overnight job to delete a 15 meg directory tree. The performance of the repository has been degrading for the last six months, but this takes the cake. I tried to create a new repository, but that failed as well. Apparently, this behaviour is 'by design', at least according to an MS knowledge base article. I tried the workarounds in that article, plus suggestions from another article or two, but was still unable to create a new repository.
The analyze I left running overnight appears to have hung. It was part-way through pass 3 and apparently frozen 15+ hours into the job. It would not abort when I attempted to close the window, so I had to kill the process. I restarted the job, this time running it directly on the server. *Sigh* The console hung up about 1.5 hours into the job. I guess this analyze is not meant to be.
On the topic of source control, our VSS database is over 4gig and headed for 5. Microsoft recommends limiting database size to 2gig and strongly recommends staying under 5. First, I got a list of physical VSS files:
ss physical $/
Then, I started an analyze:
analyze -f -v4 t:\vss\data
Actually, I ran it first without the 'fix' option (-f), but the end of the day arrived before the end of the analysis, so I interrupted it and restarted it in 'fix' mode to run overnight. Tomorrow I'll check the results and try to figure out an archiving plan. The problem is that when you archive only some projects, you have to make sure that you also archive any projects that share files with them or (apparently) you will be unable to restore the archive.
Microsoft has a VSS Best Practices document.
For a couple of years I have been running into this alternate source control system called Subversion (SVN). After reading a couple more glowing testimonials today, I decided to find out what the buzz was about. And now I know. SVN handles deletes, renames, and moves better than VSS -- and better than CVS (which is easy since CVS cannot handle such things). I am going to migrate to SVN as my personal sccs. Migrating SSI might be more difficult, but not impossible.
Most of the morning was spent tracking down delays in email delivery for someone. Last week he had noticed that a few emails were taking hours to arrive in his mailbox. Today, he noticed that some outgoing emails were also being delayed. I spent a fair amount of time inspecting ANTLIA to see whether the problem could be at our end. The MS Exchange logs showed a typical incoming and outgoing flow of email traffic. The NT event log was clean, except that McAfee GroupShield was complaining that disk space was low -- which was true for drive C. I made space on drive C and rebooted ANTLIA for good measure. In addition, I captured SMTP packets on ANTLIA for a couple of hours. Analysis of those packets did not reveal anything interesting -- except that MS Exchange keeps trying to send non-delivery responses to spammers. And MSEXCH 5.5 has no way to turn off NDRs.
I found Brazil while I was looking for a lightweight proxy server which could be extended to perform some gateway functions related to URL addressability. The problem is that technologies such as XSLT and XQuery can only retrieve web resources that are URL-addressable and retrieved using a simple HTTP GET. They cannot use HTTP POST and, by implication, web services. Furthermore, there is no way to add headers to GET requests. This rules out the various HTTP authentication schemes. Even if headers were allowed, few URI resolvers will go through, say, the basic HTTP authentication handshake sequence. The problem only worsens for higher-level protocols like SSL, WS-Security, etc.
There are a number of approaches to solving this problem:
- Write custom code that accesses the web resource and then feeds it to the XSLT/XQuery processor.
- Write a custom URI resolver that accesses the web resource.
- Write extension functions (or elements) for the query processor.
- Introduce a gateway into the process that converts simple GETs into the required requests. This could be an HTTP proxy or a servlet/filter/handler.
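As a sketch of the second approach, here is a hypothetical JAXP URIResolver that adds a Basic Authorization header before fetching the resource, so document() calls in a stylesheet can reach protected content. Only the javax.xml.transform.URIResolver interface is the standard API; the class name and constructor are mine:

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;
import javax.xml.transform.Source;
import javax.xml.transform.TransformerException;
import javax.xml.transform.URIResolver;
import javax.xml.transform.stream.StreamSource;

public class BasicAuthResolver implements URIResolver {
    private final String authHeader;

    public BasicAuthResolver(String user, String password) {
        this.authHeader = basicAuthHeader(user, password);
    }

    // Build the value of an HTTP Basic "Authorization" header.
    static String basicAuthHeader(String user, String password) {
        byte[] pair = (user + ":" + password).getBytes(StandardCharsets.UTF_8);
        return "Basic " + Base64.getEncoder().encodeToString(pair);
    }

    @Override
    public Source resolve(String href, String base) throws TransformerException {
        try {
            URL url = (base == null || base.isEmpty())
                    ? new URL(href)
                    : new URL(new URL(base), href);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestProperty("Authorization", authHeader);
            InputStream in = conn.getInputStream(); // performs the GET
            return new StreamSource(in, url.toString());
        } catch (Exception e) {
            throw new TransformerException(e);
        }
    }
}
```

An instance would be registered on the Transformer (or TransformerFactory) via setURIResolver before running the query.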
I was playing with the Brazil Framework, a fairly lightweight HTTP stack that is useful for piecing together HTTP servers. For example, one can serve up static content protected by HTTP basic authentication by chaining two built-in handlers together. Brazil is configured using a Java properties file, eg:
auth.realm=Test Content Realm
auth.message=You are not authorized!
Brazil would be invoked from the command line using this configuration thus:
java -jar brazil.jar -config myconfig.conf
For completeness, auth.mapfile contains a mapping of HTTP basic authentication strings (in Base64) to 'session identifiers' (which are more like user identifiers), eg:
bWU6bXlzZWxm=someuser
which encodes me/myself as a user/password pair, mapped to someuser.
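The Base64 key for such a mapfile entry is easy to compute with the JDK (the class name here is my own; only java.util.Base64 is the standard API):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class MapKey {
    public static void main(String[] args) {
        // HTTP basic authentication encodes "user:password" in Base64.
        String key = Base64.getEncoder().encodeToString(
                "me:myself".getBytes(StandardCharsets.UTF_8));
        System.out.println(key); // prints bWU6bXlzZWxm
    }
}
```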