... until the collector arrives ...

This "blog" is really just a scratchpad of mine. There is not much of general interest here. Most of the content is scribbled down "live" as I discover things I want to remember. I rarely go back to correct mistakes in older entries. You have been warned :)



I had trouble getting the CruiseControl JSP page that displays build results to work.  It kept getting a 'file not found' error for the log files.  The problem turned out to be caused by spaces in the pathname.  The spaces were being URL-escaped (to %20), but the file was being referenced directly, not as an URL.  This appears to be a bug in CruiseControl.

I have set up a skeletal data directory for a CruiseControl build server.  Since the CruiseControl config.xml does not support variables and is quite repetitive in structure, this skeleton uses Ant to generate the configuration from a simpler configuration file.



Apache's JMeter has sure come a long way since I last looked at it.  It seems to be capable of quite sophisticated web load testing.


SQL Server

In the absence of Enterprise Manager, here is how to create and configure a login using OSQL:

  1. sp_addlogin 'userid','password','defaultdb'
  2. sp_adduser 'userid'
  3. sp_addrolemember 'db_owner','userid'
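For example, a complete osql session might look like this (login, password, and database names are made up).  Note that sp_adduser and sp_addrolemember must be run in the target database, hence the USE:

EXEC sp_addlogin 'appuser', 's3cret', 'appdb'
GO
USE appdb
GO
EXEC sp_adduser 'appuser'
GO
EXEC sp_addrolemember 'db_owner', 'appuser'
GO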

To switch MSDE to use 'mixed mode' authentication, set the registry key:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSSQLServer\MSSQLServer\LoginMode

to '2' ('1' is Windows authentication mode only, the default).


The key() function is sensitive to the current context node in that it will only find nodes that are in the current document.  This is surprising behaviour.  The XSLT specification actually notes this "feature" (read: bug) at the end of section 12.2.  It suggests using <xsl:for-each select="$document"> construct to switch the context prior to using the key function.  That wouldn't work for me in a stylesheet that I was working on because I needed an XPath expression.  This worked:  $root[key('...', ...)]
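A sketch of the for-each workaround suggested by the spec (the key name and document variable here are hypothetical):

<xsl:variable name="lookup" select="document('lookup.xml')"/>
<!-- switch the context document so key() searches lookup.xml -->
<xsl:for-each select="$lookup">
  <xsl:copy-of select="key('item-by-id', 'x42')"/>
</xsl:for-each>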



To configure a JDBC data source for a web application in JBOSS, perform the following steps:

  1. Add a resource reference to the data source in web.xml (the reference name here is illustrative):
      <resource-ref>
          <description>My Data Source</description>
          <res-ref-name>jdbc/MyDataSource</res-ref-name>
          <res-type>javax.sql.DataSource</res-type>
          <res-auth>Container</res-auth>
      </resource-ref>
  2. Map the reference name to a JBOSS-style JNDI name in the vendor-specific descriptor, jboss-web.xml:
      <resource-ref>
          <res-ref-name>jdbc/MyDataSource</res-ref-name>
          <jndi-name>java:/MyDataSource</jndi-name>
      </resource-ref>
  3. Create a data source for the JBOSS-style JNDI name by creating a data source file named my-data-source-ds.xml in the JBOSS deploy directory, containing:
      <datasources>
          <local-tx-datasource>
              <jndi-name>MyDataSource</jndi-name>
              <connection-url>jdbc:some url</connection-url>
              <driver-class>some.jdbc.Driver</driver-class>
              <check-valid-connection-sql>select 1</check-valid-connection-sql>
          </local-tx-datasource>
      </datasources>



Here is how to perform an XSLT transformation and write the result to a file:

// xmlDoc and xslDoc are loaded MSXML DOM documents
var outputStream = new ActiveXObject("ADODB.Stream");
outputStream.Type = 1; // adTypeBinary
outputStream.Open();
xmlDoc.transformNodeToObject(xslDoc, outputStream);
outputStream.SaveToFile("output.html", 2); // adSaveCreateOverWrite
outputStream.Close();



IE has a habit of crashing randomly after certain XML operations fail.  For example, it is not unusual for IE to crash when attempting to apply an invalid XSLT against a large XML file.  The same appears to be true for invalid XSchema validations.  The solution is to close all running instances of IE and try again.  I have seen FrontPage exhibit 'weird' behaviour in these circumstances as well, but not crash (e.g. weird = unable to close a document).

I wrote a short WScript to perform XML validation:

if (WScript.Arguments.length != 1) {
    WScript.Echo("Usage: cscript validator.js <url>");
    WScript.Quit(1);
}

var url = WScript.Arguments(0);
var document = new ActiveXObject("Msxml2.DOMDocument.5.0");
document.async = false;
document.validateOnParse = true;
if (document.load(url)) {
    WScript.Echo("The file is valid.\n" + url);
} else {
    WScript.Echo(
        "Unable to load the document.\n"
        + document.parseError.reason
        + "\n" + url);
}

When using it, I discovered that MSXML appears to ignore the XML Schema associated with a document when there is a DTD present.


VS.NET and MSDN Library

Here's a neat trick to fix the "Help is not installed in Visual Studio .Net" problem.  Change the value of the registry key:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\VisualStudio\7.1\Help\0x0409\{some guid}\VS_Docs

from 'Not Supported' to '7.1'.  Black magic, but it works (at least for VS.NET 2003).


I've been using sastrong as a decent default password for MSDE installations where security is not terribly important.



Server.MapPath resolves paths relative to the current page instead of the application base.  If you want something relative to the application base, use something like Server.MapPath("~/somedir").

GetType() doesn't work for ASPX classes.  Instead of returning the type of your class, it returns the type of some funky ASP.NET wrapper class in a system namespace.


.NET System.Xml

It is pretty easy to write extension functions for XSLT under .NET.  The following code performs a transformation that uses an extension function:

// a sample extension object; its public members become XSLT functions
public class Extension {
    public string Hello(string name) { return "Hello, " + name; }
}

XslTransform xslt = new XslTransform();
xslt.Load("test.xsl");

XsltArgumentList arguments = new XsltArgumentList();
arguments.AddExtensionObject("urn:x", new Extension());
XPathDocument document = new XPathDocument("test.xml");
XmlTextWriter writer = new XmlTextWriter(Console.Out);
xslt.Transform(document, arguments, writer, null);

All of the public members of the extension class become available within the template as QNames whose prefix is mapped to the URI passed as the first argument to AddExtensionObject.  The following data types can be used as parameters or the return value:  String, Boolean, Double, XPathNavigator (for result tree fragments), and XPathNodeIterator (for node sets).
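For instance, assuming the extension object defines a public Hello(string) method, a stylesheet can call it like this (the x prefix is bound to the same URI passed to AddExtensionObject):

<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:x="urn:x">
  <xsl:template match="/">
    <result><xsl:value-of select="x:Hello('world')"/></result>
  </xsl:template>
</xsl:stylesheet>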

SQL Server 2000 XML

The simplest way to generate nested XML from SQL Server is to use the FOR XML AUTO syntax:

SELECT root.version, parent.id, child.name
FROM (SELECT 1 AS version) AS root
JOIN parent ON (1 = 1)
JOIN child ON (child.parent_id = parent.id)
FOR XML AUTO

This will produce output such as:

<root version="1"><parent id="..."><child name="..."/></parent></root>

If you use the FOR XML RAW syntax instead, then one row element will be generated for each result row, e.g.

<row version="1" id="..." name="..."/>

For complete control of the result set shape, use FOR XML EXPLICIT, e.g. (table and column names here are illustrative):

SELECT
  1 AS Tag,
  null AS Parent,
  1 AS [Root!1!version!hide],
  null AS [Case!2!id],
  null AS [ClaimantData!3!id!hide],
  null AS [ClaimantData!3!first_name],
  null AS [ClaimantData!3!last_name],
  null AS [ClaimantData!3!!xml]
UNION ALL
SELECT
  2 AS Tag,
  1 AS Parent,
  1, CaseData.id, null, null, null, null
FROM CaseData
UNION ALL
SELECT
  3 AS Tag,
  2 AS Parent,
  1, CaseData.id, ClaimantData.id,
  ClaimantData.first_name, ClaimantData.last_name, ClaimantData.xml_data
FROM ClaimantData
JOIN CaseData ON (ClaimantData.case_id = CaseData.id)
ORDER BY 3, 4, 5
FOR XML EXPLICIT

This last option uses a so-called 'universal table' to define the XML document.  The first two columns assign tag numbers and parent-child relationships, and must be named Tag and Parent.  The remaining columns must contain the join keys and data, and be sorted into the desired document order.  The key and data columns must be named according to the convention ElementName!TagNumber!AttributeName!Directive.  The last two components are optional.  All this is quite ugly, but it does allow finer control of the final product.  Note, for example, how certain columns do not appear in the output (e.g. ClaimantData!3!id!hide) and how columns that contain XML data can be included in place (e.g. ClaimantData!3!!xml).

None of this stuff conforms to the emerging SQL/XML standard which, by all accounts, Microsoft has no intention of supporting.


Here is a gotcha:  if you attempt to protect the contents of a directory using a web.config file like:

<location path="data">
    <system.web>
        <authorization>
            <deny users="*"/>
        </authorization>
    </system.web>
</location>

... it won't work. ASP.NET only respects the configured permissions for files that it serves. Static directories, and most files in them, are served by IIS and must be protected using IIS settings.



A nasty gotcha: ASP.NET stores the session identifier as a global cookie.  This has adverse effects upon session state in circumstances such as:

  1. open your app in a browser window and browse to same page that carries session state
  2. open your app in a second browser window and browse to a different page that has noticeably different session state
  3. return to the first browser and perform an action that relies upon the session state

You will notice that the first browser window has picked up state from the second.

This problem is not unique to ASP.NET, but affects any system that uses session cookies.  Depending upon the application, it might be safer to carry the session identifier in the URL or as variables on the page.
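In ASP.NET, moving the session identifier into the URL is a one-line web.config change, something like:

<configuration>
  <system.web>
    <sessionState cookieless="true"/>
  </system.web>
</configuration>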



There is a way to monitor the ADO.NET connection pool.  Open the administrator applet Performance and add counters from the .Net CLR Data performance object.  There are various SqlClient counters.

In a related note, you can enable ODBC connection pool counters on the ODBC Data Source Administrator applet.  Click on the Connection Pooling tab and use the Enable setting under PerfMon.  I had to exit PerfMon and re-enter it to see the change.


NUnit Gotcha

If you neglect to make the constructor on a test fixture public, the NUnit test runner will complain that the fixture is invalid with the misleading reason 'the fixture contains no tests'.


Internet Explorer DOM

Using IE 6, when you create an INPUT node, you must set the type property prior to adding the node as a child of another element.  If you try to add it afterwards, IE complains that the type property cannot be read.
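A sketch of the working order (the element and target names are illustrative):

var input = document.createElement("input");
input.type = "checkbox";               // set type first...
document.forms[0].appendChild(input);  // ...then insert; the reverse order fails in IE 6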


SQL Server

Here is an MS SQL statement that creates an SQL script containing INSERT statements for all of the data in all of the tables in the current schema:

select
    case when c.ordinal_position = 1
        then 'SELECT ''INSERT INTO "' + t.table_name + '" VALUES ('''
        else '' end
    + ' + coalesce('
    + case when patindex('%char%', c.data_type) <> 0
           or patindex('%text%', c.data_type) <> 0
           or patindex('%date%', c.data_type) <> 0
           or patindex('%time%', c.data_type) <> 0
        then ''''''''' + replace(' + c.column_name + ', '''''''', '''''''''''') + '''''''''
        else 'convert(varchar, ' + c.column_name + ')' end
    + ', ''NULL'')',
    case when c.ordinal_position <> all(select max(ordinal_position)
                                        from information_schema.columns
                                        where table_name = t.table_name)
        then ' + '','''
        else ' + '');'' FROM "' + t.table_name
            + case when t.table_name = all(select max(table_name)
                                           from information_schema.tables
                                           where table_type = 'BASE TABLE'
                                           and table_name <> 'dtproperties')
                  then '";' else '" UNION' end
        end
from information_schema.tables as t
join information_schema.columns as c on (t.table_name = c.table_name)
where t.table_type = 'BASE TABLE'
and t.table_name <> 'dtproperties'
order by t.table_name, c.ordinal_position

It doesn't handle all the data types, but works well for small databases.  The statement could be adapted to other dialects of SQL.



In anticipation of demonstrating QJ to Schlumberger, I went looking for some open source E&P software, especially simulators, that I could try to hook into QJ.  I found Sim42, an open source chemical process simulator.  I also dragged SysQuake out of distant memory as a general purpose simulation package.


Visual Studio

The way Visual Studio 2003 handles web projects still sucks.  As I noted before, there is a way to work around most of the problems discussed at:

ASP.NET Applications without Web Projects

In the end, however, these were the steps I followed to get things to work in my project:

  1. Create a deploy directory to hold your assembled web application.  Add this directory to your web server instead of the default build outputs directory.
  2. Create a post-build script to copy the build outputs to the deploy directory, as well as other critical files like the *.aspx, Global.asax, Web.config, etc.  Note that the DLLs for the application must be contained in a directory tree rooted in a directory called bin.  A sample post-build script might look like:
    del/s/q $(ProjectDir)deploy
    xcopy/y/f $(ProjectDir)Web.config $(ProjectDir)deploy
    xcopy/y/f $(ProjectDir)Global.asax $(ProjectDir)deploy
    xcopy/y/f $(ProjectDir)*.aspx $(ProjectDir)deploy
    xcopy/y/f $(TargetDir)*.dll $(ProjectDir)deploy\bin
  3. Make sure you have a Web.config file and that the attribute /configuration/compilation/@debug is true.
  4. Edit the project's configuration properties so that:
    1. Debugging\Enable ASP.NET Debugging is true
    2. Debugging\Debug Mode is URL
    3. Debugging\Start URL is the URL of your page

Also, as a one-time operation on any given Visual Studio installation, I followed the steps on the web site referenced above to make all of the web items available on the "new item" palette.  Specifically, I added the following lines to the file VC#\CSharpProjectItems\LocalProjectItems\localprojectitems.vsdir in the VS installation directory:




Stylus Studio uses a nice XML Schema documentation generator called xs3p.

Stylus Studio


 The expression context for the XPath query functionality does not define the default namespace.  Therefore, there is no way to search for elements in most documents, i.e. documents that have a non-empty default namespace.  How did they miss that one?

The query v.1 button keeps getting reselected when I type text into the query box even after I have explicitly chosen v.2.  I have to type first, then select v.2.

XPath2 querying is sketchy.  There is no mention of it in the documentation.  The expression "for $x in (1, 2, 3) return $x" generates a warning dialog box with no text in it.  "(1, 2, 3)" works.

Visual Studio

The way Visual Studio 2003 handles web projects sucks.  You must create a virtual directory on IIS that contains not only your binaries, but also your source.  Weird.  Also, VS2003 cannot handle filenames that start with a dot -- such as Subversion .SVN files.  I found an article that shows how to work with ASP.NET with a standard local project:

ASP.NET Applications without Web Projects


Stylus Studio

I discovered Stylus Studio, a great XML editor from Sonic.  It does all the usual XML, Schema, DTD stuff, but also edits XSLT and XQuery -- with interactive debugging.  I found out about it from a comment by Michael Kay on the Saxon mailing list.


Font Identification

I need to identify a font used on a web page and went looking for online tools for this purpose.  I ran into IdentiFont.  It uses a series of skill-testing questions to identify a font that you are looking at.

WhatTheFont, on the other hand, uses an uploaded image to identify a font.

Linotype's FONTIDENTIFIER is similar to IdentiFont... very similar.  Is it the same?


ADO and JScript

I had trouble trying to use a date parameter in a SQL statement using JScript and ADO.  Here is the answer:

var connection = new ActiveXObject("ADODB.Connection");
connection.ConnectionString = "Provider=...";
connection.Open();
var command = new ActiveXObject("ADODB.Command");
command.ActiveConnection = connection;
command.CommandText = "INSERT INTO ... VALUES (..., ?, ...)";
command.Parameters.Item(0).value = messageTime.getVarDate(); // convert the JScript Date to a COM date
command.Execute();


Content Management Systems

Yesterday, a colleague spoke of an initiative at his wife's company to put some automated workflow processes in place.  The company had already talked to a consultant who proposed using Zope to meet their needs.  This led me into an investigation of Zope, and many other open source content management systems:

  • Plone - provides substantial access control and workflow features.  Layered on the Zope web application server using Zope CMF (content management framework).  One-step installation using a conventional installer.
  • TikiWiki - a wiki on steroids.  It claims to have pretty well every collaboration feature I can think of, including workflow support.  Requires a web server, PHP, and MySQL.
  • Magnolia - a Java-based (JSR-170) system.  Fairly light in the feature department.  Ease-of-use is its main claim.
  • OpenCms - Needs Java 1.4, Tomcat 4, and MySQL.  Page editing uses ActiveX controls in IE (with a fallback to a simple type-in box for non-IE browsers).
  • Typo3 - Needs PHP4 and MySQL.

CMS Watch is a good site for monitoring CMS developments.  cms matrix is another.



I found OpenWiki, a very lightweight and easy-to-install wiki for IIS+ASP.  The installation steps, using an MS Access database as the back end, are:

  1. Download the latest distribution.
  2. Copy data\OpenWikiDist.mdb to some location and rename it to something like MyWiki.mdb.
  3. Publish the owbase directory (or a copy thereof) in IIS.
  4. Edit owbase\ow.asp to include the following configuration directives:
    OPENWIKI_DB = "Driver={Microsoft Access Driver (*.mdb)};DBQ=h:\wherever\MyWiki.mdb"
    OPENWIKI_TITLE = "My Wiki"

That's it!  All of the configuration directives can be found in owbase\ow\owconfig_default.asp.



I played around with DOM and CSS some more, trying to get the calendar from yesterday to embed well.  I had discovered that the CSS that I was using to format the calendar could be easily overridden by CSS that applies to containing elements, causing the calendar to look bad in those circumstances.  For example, I had rules with selectors such as "table.calendar td".  Unfortunately, when I tried to embed the calendar in a containing document that used selectors such as "div.container td", the latter's rules would override the former's whenever the calendar nested within the container's selected elements.  In CSS 2, one can write much more picky selectors, such as "table.calendar > tbody > td" which are very specific.  Alas, CSS 2 is neither widespread, nor well implemented.  So, I rewrote the calendar HTML and CSS so that every calendar element that needed styling would have an unambiguous ID or class.  As an example of an embedding document, see 2004-08.html.



I tried to create an HTML page that used ECMAScript to generate a calendar.  First, I used document.write() to create it -- that worked in IE, Mozilla, and Firebird.  Then, I tried to use DOM methods.  This is much nastier.  First, I discovered that IE will not display a table that is created using raw createElement() calls for TR, TD, and TH elements.  The debugger reveals that the correct object model has been created, but IE just refuses to display it.  I then switched to use the insertRow() and insertCell() methods on the various table objects.  This mostly worked in the browsers, but:

  • Moz/FB put all rows into the THEAD instead of just the header rows.
  • None of the browsers created TH elements in the THEAD section -- they all used TDs.
  • IE insisted upon wrapping the text contained in the header at the first space -- whether it needed to wrap or not.

After much fidgeting, I just could not get the bugs to disappear.  I guess that it is best to use good old document.write()... but that means that one needs to do HTML escaping.
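Since document.write() takes markup, any text dropped into it needs escaping.  A minimal helper (mine, not from the calendar code):

```javascript
// escape text so it can be embedded safely in markup passed to document.write()
function escapeHtml(text) {
    return String(text)
        .replace(/&/g, "&amp;")
        .replace(/</g, "&lt;")
        .replace(/>/g, "&gt;")
        .replace(/"/g, "&quot;");
}
```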

Update: I found a workable compromise.  First, I didn't bother with the THEAD element at all since Moz got it wrong.  Second, I used createElement("th") to create the header cells.  IE doesn't seem to choke on that.  The result is in calendar.html.



We had lots of inexplicable performance problems with the network today, starting right in the morning.  The problems appeared to be confined to the switch in the tech office.  We tried swapping switches and cables, with no effect.  I rebooted the main switch in the computer closet.  That appeared to fix the problems.


I also noticed some zero page, zero byte print jobs were being reported for my user id whenever I tried to browse a network drive. It turns out that TortoiseSVN scans all shares for .SVN files -- including printers!  Apparently, this causes these bogus print jobs to appear.


MS SQL Server 2005 Express

I discovered that there was a new, free, version of MSDE called MS SQL Server 2005 Express.  This next generation version of MSDE supports .Net 2.0 and XQuery.  I downloaded and installed .Net 2.0 beta and Express beta -- only to find that the whiz-bang client tools do not exist yet.  I will await those tools before continuing with my evaluation.


It turns out that I don't need to use UrlScan.  IIS doesn't serve up the .SVN directories for some reason.  Perhaps because they are hidden files?



I migrated our Intranet web site files from VSS to SVN.  I wanted to make the 'live' web directory an SVN working copy, but I didn't know how to configure IIS to not serve up the numerous .SVN files.  I was looking for an equivalent to the Apache 'limit' directives.  Apparently, there is no such concept on IIS.  However, Microsoft has a global ISAPI filter called UrlScan which can, among other things, be used to control whether files are served up based upon their extension.  UrlScan is configured by the file:


The default UrlScan permissions are fairly tight.  In order to avoid breaking any other applications (like ASP.NET or FrontPage extensions), I moved the UrlScan filter down the list in priority (on web site properties\ISAPI filters page).  Note that UrlScan loads as a 'high priority' filter unless you adjust the setting of the AllowLateScanning directive in urlscan.ini).



I installed the subclipse Eclipse plug-in for Subversion.  There were some gotchas:

  • The zip file contained what looked to be a standard Eclipse plug-in directory structure but, in fact, the real plug-ins were nested more deeply in the structure.  In addition, there were some DLLs to throw in the main Eclipse directory.
  • Make sure that you exclude .svn directories from the build copying process (by adding '.svn' to the list of filtered resources on the Preferences/Java/Compiler/Build Path panel).  If you don't, subclipse thinks that all of the build product directories are under source control and gets very confused.  Also, you cannot rebuild the project as Eclipse will fail to delete all of the read-only files in the .svn directories.  You must delete the directories manually.



With VSS on the fritz, we decided to try out Subversion.  I spent the day getting svnserve running, creating a repository, and populating it from t:\java2.  We will try SVN for a little while and, if it works, will migrate more projects into it.  Ultimately, we hope to retire VSS.



Our VSS repository is done for.  It took an overnight job to delete a 15 meg directory tree.  The performance of the repository has been degrading for the last six months, but this takes the cake.  I tried to create a new repository, but it did not work.  Apparently, this behaviour is 'by design', at least according to an MS knowledge base article.  I tried using the workarounds in that article, plus suggestions in another article or two, but was unable to create a new repository.



The analyze I left running overnight appears to have hung.  It was part-way through pass 3 and apparently frozen 15+ hours into the job.  It would not abort when I attempted to close the window, so I had to kill the process.  I restarted the job, this time running it directly on the server.  *Sigh*  The console hung up about 1.5 hours into the job.  I guess this analyze is not meant to be.



I took a look at GraphViz.  It seems like it might be worth learning about this next time I have a visualization problem.


On the topic of source control, our VSS database is over 4gig and headed for 5.  Microsoft recommends limiting database size to 2gig and strongly recommends staying under 5.  First, I got a list of physical VSS files:

ss physical $/ -r -o@h:\desktop\physical.txt

Then, I started an analyze:

analyze -f -v4 t:\vss\data

Actually, I ran it first without the 'fix' option (-f), but the end of the day arrived before the end of the analysis, so I interrupted it and restarted it in 'fix' mode to run over night.  Tomorrow I'll check out the results and try to figure out an archiving plan.  The problem is that when you archive only some projects, you have to make sure that you archive any projects that they share or (apparently) you will be unable to restore the archive.

Microsoft has a VSS Best Practices document.

For a couple of years I have been running into this alternate source control system called Subversion (SVN).  After reading a couple of more glowing testimonials today, I decided to find out what the buzz was about.  And now I know.  SVN handles deletes, renames, and moves better than VSS -- and better than CVS (which is easy since CVS cannot handle such things).  I am going to migrate to SVN as my personal sccs.  Migrating SSI might be more difficult, but not impossible.


Exchange Woes

Most of the morning was spent tracking down delays in email delivery for someone.  Last week he had noticed that a few emails were taking hours to arrive in his mailbox.  Today, he noticed that some outgoing emails were also being delayed.  I spent a fair amount of time inspecting ANTLIA to see whether the problem could be at our end.  The MS Exchange logs showed a typical incoming and outgoing flow of email traffic.  The NT event log was clean, except that McAfee GroupShield was complaining that disk space was low -- which was true for drive C.  I made space on drive C and rebooted ANTLIA for good measure.  In addition, I captured SMTP packets on ANTLIA for a couple of hours.  Analysis of those packets did not reveal anything interesting -- except that MS Exchange keeps trying to send non-delivery responses to spammers.  And MSEXCH 5.5 has no way to turn off NDRs.


I discovered that Mozilla Thunderbird supports NTLM authentication for POP3 (and IMAP).  It is enabled by checking the 'use secure authentication' checkbox on the account server settings panel.  No more cleartext passwords!

PLT Scheme

I stumbled across PLT Scheme.  A great teaching environment for Scheme, reminiscent of what BlueJ does for Java.


URL Addressable Gateways

I found Brazil while I was looking for a lightweight proxy server which could be extended to perform some gateway functions related to URL addressability.  The problem is that technologies such as XSLT and XQuery can only retrieve web resources that are URL-addressable and retrieved using a simple HTTP GET.  They cannot use HTTP POST and, by implication, web services.  Furthermore, there is no way to add headers to GET requests.  This rules out the various HTTP authentication schemes.  Even if headers were allowed, few URI resolvers will go through, say, the basic HTTP authentication handshake sequence.  The problem only worsens for higher-level protocols like SSL, WS-Security, etc.

There a number of approaches to solving this problem:

  1. Write custom code that accesses the web resource and then feeds it to the XSLT/XQuery processor.
  2. Write a custom URI resolver that accesses the web resource.
  3. Write extension functions (or elements) for the query processor.
  4. Introduce a gateway into the process that converts simple GETs into the required requests.  This could be an HTTP proxy or a servlet/filter/handler.

Brazil Framework

I was playing with the Brazil Framework, a fairly lightweight HTTP stack.  It is useful for piecing together HTTP servers.  For example, one can serve up static content protected by HTTP basic authentication by chaining to built-in handlers together.  Brazil is configured using a Java properties file, eg:


chain.handlers=auth content

auth.realm=Test Content Realm
auth.message=You are not authorized!


Brazil would be invoked from the command line using this configuration thus:

java -jar brazil.jar -config myconfig.conf

For completeness, auth.mapfile contains a mapping of HTTP basic authentication strings (in Base64) to 'session identifiers' (which are more like user identifiers), eg:

bWU6bXlzZWxm=someuser

which encodes me/myself as a user/password pair, mapped to someuser.



Don and I discovered a rather nasty bug in ASP.NET configuration.  If you define a security policy involving a UrlMembershipCondition with an invalid URL, the permission appears to be granted to everything (or a randomly selected collection of components)!!!  I wonder whether the problem applies to .NET generally...



There was much wringing of hands trying to get the wellview ASP.NET application working on Andromeda.  At first, it appeared that the app required more privilege than is granted by default by ASP.NET.  However, it turned out that the problem was that application debugging was turned on.  The default trust level did not permit this.  Turning off debugging in the application's web.config file solved the problem.  I had not noticed this behaviour while developing on Nemesis.  Apparently, the 'out-of-the-box' behaviour on that machine was to grant the required debugging permissions (full trust!) to applications.  There is no global web.config file and the usual .NET config tool did not show any relevant rules.  So... the question is why is full trust being granted on Nemesis?  Update: it is because of the file:

machine.config (in the .NET Framework CONFIG directory)

This file contains a lot of .NET settings.  ASP.NET is configured in the system.web section.

One handy-dandy trick to note: there is an 'application tracing' setting in the web.config file.  When it is turned on, you can view the trace log by hitting trace.axd on your application's URL.
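The setting itself looks something like this in web.config:

<configuration>
  <system.web>
    <trace enabled="true" pageOutput="false"/>
  </system.web>
</configuration>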


Adobe SVG Plug-in

You can install the Adobe SVG Plug-in (v6.0) into Mozilla or Firefox by copying the files NPSVG6.DLL and NPSVG6.zip from Program Files\Common Files\Adobe\SVG Viewer 6.0\Plugins to the plugins directory of the browser.

NTLM Support for Servlets

jCIFS supports NTLM for Servlet containers!


I was using awstats to analyze IIS server logs.  It requires Perl.  At first I was using Perl installed under CygWin, but I found that it did not handle newlines properly.  So I switched to ActiveState.  That worked.  I found out that the default IIS log format is a bit damaged.  It is missing two key pieces of information that awstats needs:  date(!) and bytes transferred.  I wrote a Perl script that adds the correct date to the beginning of each log line (extracting it from comments in the log) and adds a zero as a dummy byte count:

my $date = "<unknown>";
foreach (<>) {
    if (/^#Date: ([^ ]*)/) { $date = $1; print $_; }
    elsif (/^#/) { print $_; }
    else { chomp; print $date . " " . $_ . " 0\n"; }
}

This made it possible for awstats to perform its analysis.  awstats is a bit finicky about configuration.  In the wwwroot/cgi-bin of the distribution, I created a configuration file named (for example), awstats.nemesis.conf (where nemesis is the host name).  I changed:

  • LogFile to point to the log file to analyze
  • LogFormat to "date time c-ip cs-method cs-uri-stem sc-status %bytesd"
  • SiteDomain to nemesis.
  • DirData to ../data.
  • DirIcons to <distribution directory>/wwwroot/icon

Then, I created the statistics by running this command in the cgi-bin directory:

awstats.pl -config=nemesis -update

Finally, I created the web pages with the command:

awstats_buildstaticpages.pl ^
    -config=nemesis ^
    -dir=<output directory>

I had to copy the configuration file into ..\..\tools for this to work.  Also, the output directory had to exist.

All-in-all, I wasn't that impressed with the reports generated.



What I was trying to do was create a DOM object that contains all of the ASP request variables, and feed that into an XSL program that computes the page.  It was a bit tricky:

<%@ Language = "JScript" %>
<%
// load the stylesheet and source document (paths illustrative)
var xslDoc = Server.CreateObject("Msxml2.FreeThreadedDOMDocument.3.0");
xslDoc.async = false;
xslDoc.load(Server.MapPath("page.xsl"));

var xmlDoc = Server.CreateObject("Msxml2.DOMDocument.3.0");
xmlDoc.async = false;
xmlDoc.load(Server.MapPath("data.xml"));

// build a DOM containing all of the request variables
var parametersDoc = Server.CreateObject("Msxml2.DOMDocument.3.0");
var parameters = parametersDoc.createElement("parameters");
parametersDoc.appendChild(parameters);
for (var k = new Enumerator(Request.QueryString); !k.atEnd(); k.moveNext()) {
    var key = k.item();
    for (var v = 1; v <= Request.QueryString(key).Count; ++v) {
        var p = parametersDoc.createElement(key);
        p.text = String(Request.QueryString(key)(v));
        parameters.appendChild(p);
    }
}

var xslt = Server.CreateObject("Msxml2.XSLTemplate.3.0");
xslt.stylesheet = xslDoc;
var xslProc = xslt.createProcessor();
xslProc.input = xmlDoc;
xslProc.addParameter("parameters", parameters);
xslProc.transform();
Response.ContentType = "text/html";
Response.Write(xslProc.output);
%>

The trickiness lies in the loop that collects the query-string parameters.  Note the use of an Enumerator object, that collections use one-based indices, and that collections are indexed using function-call notation instead of array-index notation.

The next step would be to persist state in a session variable, passing that state in as a document to the XSLt and storing back the updated state returned by the XSLt (hiding it, say, in the <HEAD> tag of the HTML).  Alternatively, there could be two XSLt programs involved:  one to compute the new state and one to render the output.


I was struggling with an ASP page, trying to get all of the request variables.  I was using JScript, of course, since I cannot stand to use VBScript.  As it turns out, one needs to take special measures when attempting to access collections in JScript.  The pattern looks like this:

for (var k = new Enumerator(Request.QueryString); !k.atEnd(); k.moveNext()) {
    var key = k.item();
    // ... Request.QueryString(key) yields the value(s) for this key ...
}


Here is an ASP snippet that injects request parameters into XSLT to generate a web page:

<%@ Language = "JScript" %>
<%
var myparam = String(Request.QueryString("myparam"));

var xslDoc = Server.CreateObject("Msxml2.FreeThreadedDOMDocument.4.0");
xslDoc.async = false;
xslDoc.load(Server.MapPath("page.xsl"));   // stylesheet name is illustrative

var xmlDoc = Server.CreateObject("Msxml2.DOMDocument.4.0");
xmlDoc.async = false;
xmlDoc.load(Server.MapPath("page.xml"));   // source document name is illustrative

var xslt = Server.CreateObject("Msxml2.XSLTemplate.4.0");
xslt.stylesheet = xslDoc;
var xslProc = xslt.createProcessor();
xslProc.input = xmlDoc;
xslProc.addParameter("myparam", myparam);
xslProc.transform();
Response.Write(xslProc.output);
%>


The LongVarChar problem from yesterday was solved by switching from an ODBC driver to an OLEDB provider.
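For the record, the switch amounts to changing the connection string from the ODBC driver form to the SQL Server OLEDB provider form (server, database, and credentials below are illustrative):

```
Before (ODBC driver):
    Driver={SQL Server};Server=myserver;Database=mydb;Uid=myuser;Pwd=secret;

After (OLEDB provider):
    Provider=SQLOLEDB;Data Source=myserver;Initial Catalog=mydb;User ID=myuser;Password=secret;
```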


When interfacing the WellView calculation engine to ASP.NET, I had to use the interop bridge to ADO so that WellView could share the data connection.  When I do this, I cannot retrieve "text" fields (e.g. LongVarChar).  As usual.  It seems very few components can access these fields.  A quick solution was not forthcoming.

I stumbled across a classic newbie-confusing XQuery compiler message: "no context for path expression".  It might mean you forgot to prefix a variable with a dollar sign.


While writing an XQuery, I ran into a problem with Saxon 7.9.1.  The following query:

typeswitch ("x")
case $e as element() return $e
default $x return $x

caused this error:

java.lang.IllegalStateException: Variable $x has not been fixed up

I reported the error to the saxon-help@lists.sourceforge.net mailing list.



A colleague was getting a SecurityException while running a .Net WebPart (MS-speak for portlet) under SharePoint.  We poked around to figure out how to grant permissions to WebParts.  The usual .Net config editor approach didn't work.  After a bit of googling, we turned up:

Microsoft Windows SharePoint Services and Code Access Security


I figured out how to import a sample LDIF file into OpenLDAP:

ldapadd -f sample.ldif -x -D \
    "cn=administrator,dc=somedomain,dc=com" -w secret
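A minimal sample.ldif compatible with that bind DN might look like the following (the domain and attribute values are illustrative):

```
dn: dc=somedomain,dc=com
objectClass: dcObject
objectClass: organization
dc: somedomain
o: Some Domain

dn: ou=people,dc=somedomain,dc=com
objectClass: organizationalUnit
ou: people
```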

SSL Problems

A colleague and I had a hard time getting our IIS servers to accept client-side certificates.  It turns out the problem was that we had installed the SSI-TEST-CA certificate in the trusted root of our own user accounts, not in the machine trusted root.  He had an even more difficult time of it, since SSL on his server became completely disabled for reasons unknown.  He and I spent most of the day trying to troubleshoot it, with no result.  The problems were:

  • At first, client-side certificates would not be recognized.
  • Later, server-side certificates started failing.
  • Finally, HTTPS failed completely.

This degradation occurred as he was uninstalling and re-installing certificates.  We tried:

  • blowing all the certificates away and re-installing them
  • uninstalling IIS and re-installing it
  • hacking at IIS's metabase.bin file using metaedit
  • restoring the system to an earlier restore point (this one worked, but when we tried to reconfigure IIS for SSL, it quickly degraded again).

We are stumped.


Java keytool

The java keytool cannot import or export private keys.  In particular it cannot handle PKCS12 files (although the J2EE version of keytool can).  As a workaround, I downloaded PKCS12Import.java from the Jetty project.  It can create a JKS keystore file from a PKCS12 file.


I investigated trying to install an LDAP server for testing purposes.  I looked at OpenLDAP, but it wants to run on Unix, not Win32.  I installed the Windows Server 2003 Admin Pack and investigated using Active Directory for LDAP.  It turns out that you cannot run Active Directory unless the server is a domain controller -- which is not going to happen.  Back to OpenLDAP...

I downloaded OpenLDAP for Windows from Lucas Bergman's site.  I referred to the OpenLDAP admin guide and installation steps in someone's homework assignment.  Installation:

  • changed all of the paths in slapd.conf to relative paths in the appropriate installation directory
  • changed the database suffix and rootdn parameters to appropriate values
  • ran slapd -- didn't work, no output.  Ran it again with the debug switch, -d 1.  It was complaining that it could not find the slapd.conf file.  I ran it again using the command line slapd -f etc/slapd.conf -d 1.
  • Now it is complaining that ucdata is not a valid directive and, later, 'error loading ucdata (error -127)'.
  • I tried switching from the BDB backend to the LDBM backend.  No change.
  • The slapd man page does not mention anything about ucdata.
  • I downloaded the source and discovered that there is an undocumented directive named ucdata-path.  I changed the config file to use this directive and, voila, slapd is running.
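The working slapd.conf ended up looking roughly like this (paths, suffix, and password are illustrative; ucdata-path is the undocumented directive found above):

```
# excerpt from etc/slapd.conf
ucdata-path  ./ucdata
include      ./etc/schema/core.schema

database     bdb
suffix       "dc=somedomain,dc=com"
rootdn       "cn=administrator,dc=somedomain,dc=com"
rootpw       secret
directory    ./var/openldap-data
```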

I spent the afternoon at the POSC WITSML SIG meeting.


I continued working on the 'web services' investigation project.  I spent time reading up on the Microsoft Office XP Web Services Toolkit.  I downloaded the toolkit and generated some VBA from our well pilot WSDL.  That's as far as I got.

I am still trying to decide whether to focus our initial web efforts on SOAP web services, or URI addressable HTTP GETs.  I tried to locate a paper I read on this topic, but was unable to find it.  I thought it was in Fielding's REST paper, but it wasn't.  I also poked around W3C's TAG site.

I helped a colleague work through certificate generation for IIS.  We ended up using OpenSSL (in CygWin) to generate the certificates, and they worked fine.  I rooted around in my archives to locate any docs I had about certificate generation, and I found some old stuff about Microsoft's MAKECERT (in my old diary!).  I have summarized certificate generation in a document.
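The OpenSSL part can be sketched as follows, assuming a simple self-signed test certificate is enough (the subject name and file names are illustrative):

```shell
# Generate a private key and a self-signed certificate in one step,
# then display the subject to confirm the certificate is usable.
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
    -keyout server.key -out server.crt \
    -subj "/CN=nemesis"
openssl x509 -in server.crt -noout -subject
```

A certificate destined for IIS would then still need to be packaged (e.g. as PKCS12) and imported into the appropriate certificate store.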

A few days ago I captured some miscellaneous Java lore.


Finished up experimentation with using SSL with SOAP in Apache AXIS.  The Eclipse project can be found in axis-test.  Of particular interest is the document about configuring Tomcat/AXIS to use SSL.
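From memory, the HTTPS connector in Tomcat's conf/server.xml looked roughly like this (Tomcat 5 syntax; the keystore path and password are illustrative):

```
<!-- HTTPS connector excerpt from conf/server.xml -->
<Connector port="8443" scheme="https" secure="true"
           clientAuth="false" sslProtocol="TLS"
           keystoreFile="conf/keystore.jks" keystorePass="changeit"/>
```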


I installed synergy, OSS that creates a virtual desktop out of the monitors from any number of machines and that allows you to control that virtual desktop from a single keyboard and mouse.  It supports copy-and-paste of text across the systems as well.

It has some idiosyncrasies.  Sometimes the client software will fail to connect to the server after a reboot.  Also, sometimes the mouse freezes during heavy processing (on either side of the connection).  Finally, the mouse occasionally enters a state in which it thinks a button is being held down when it is not.  However, all of these occurrences are rare, and synergy is very pleasant to use.


I discovered that the Microsoft XSD (and WSDL) tools cannot handle cyclic data structures.  That limits their usefulness, I think.  Take a look at cyclic.xsd and the generated cyclic.cs.


A nifty utility for troubleshooting .NET assembly load errors is FUSLOGVW.EXE.  It can be found in the .NET SDK\bin directory.  It is a bit finicky, though: the 'view log' and 'delete' buttons apply to the current selection, and you can only select a row by clicking in the leftmost column.

Normally, only failures are logged (provided the 'log failures' checkbox is checked).  However, a couple of registry settings will control what else is logged:

  • HKLM\Software\Microsoft\Fusion\ForceLog = (dword)1 logs all binds
  • HKLM\Software\Microsoft\Fusion\LogResourceBinds = (dword)1 logs failures to satellite assemblies
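Those two settings can be captured in a .reg file for easy toggling (a sketch, assuming the key layout above):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\Software\Microsoft\Fusion]
"ForceLog"=dword:00000001
"LogResourceBinds"=dword:00000001
```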
