Jun 5 10

SSD Compiler Benchmark

by Alex Peck

Anandtech has some good articles illustrating the performance gains you might expect from an SSD for everyday tasks, but Visual Studio, and compilation in particular, is not considered. A cursory search of the rest of the Internet turned up no decent benchmarks investigating the impact of SSDs on compile time, so I decided to perform my own. It’s well known that many of the operations performed by Visual Studio are disk bound, so there seems to be some potential.

I discussed this with colleagues at work, and someone suggested that compiling managed code is more CPU intensive than native. This would suggest that it might be CPU bound, rather than disk bound. I therefore decided to test both native and managed code compilation, using both a conventional hard disk drive and a solid state drive.

Test Setup

I used my ageing PC running Windows 7 x64 Ultimate and Visual Studio 2010 Ultimate. For each disk I did a clean install and didn’t apply any updates.

I ran my tests on the following:

I selected the following test code to compile because it is relatively fast and self-contained (I didn’t want to spend hours doing this!) and fairly representative of a small system (compiling multiple dependent binaries):

  • Microsoft Enterprise Library 5.0 – a small set of managed components written by the Microsoft Patterns & Practices team.
  • LAME, the open source MP3 encoding library. I used version 3.98.4, and did a little hacking to get it to build against a recent GTK+.

Results

I compiled each test program three times and took the average result. I turned on build timing in Visual Studio by setting Tools->Options->Projects and Solutions->Build and Run->MSBuild project build output verbosity to Detailed (VS2010 now uses MSBuild for native code as well as managed).

Compile time in seconds. Lower is better.

So, using an SSD we get around a 21% gain in speed on native code, and a 16% gain compiling managed. Not bad. Although my Raptor isn’t the most recent model, it is representative of the fastest conventional SATA hard disk available. On this basis, I would expect a RAID setup based on 10k RPM disks to outperform the SSD.

May 9 10

P/Invoke Interop Assistant

by Alex Peck

After messing about for hours trying to p/invoke GetTokenInformation, only for it to return what appeared to be sinographs, I came across the P/Invoke Interop Assistant on CodePlex.

The P/Invoke Interop Assistant: it generates some useful stuff

Although this didn’t solve my problem (which was correctly marshalling a fixed-size char array), it would doubtless have saved me time writing the boilerplate structs, enums and method signatures. Highly recommended.
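
For anyone stuck on the same thing, the usual way to marshal a fixed-size char array inside a native struct is a MarshalAs attribute on the managed field. Here is a minimal sketch; the struct and buffer size are purely illustrative, not a real Windows definition:

using System.Runtime.InteropServices;

// Illustrative only: a hypothetical native struct declared as
//     struct EXAMPLE { WCHAR Name[32]; };
// can be marshalled by sizing the managed field to match.
[StructLayout(LayoutKind.Sequential, CharSet = CharSet.Unicode)]
internal struct Example
{
    // ByValTStr marshals the field as an inline, fixed-size character buffer;
    // ByValArray with a char[] field is the other common option.
    [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 32)]
    public string Name;
}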

May 6 10

Parsing HTML tables into System.Data.DataTable

by Alex Peck

What follows is a quick and dirty class I made to parse HTML tables into DataTables. As usual, it is the result of internet search/run/bug fix/refactor.

In use, it looks a little like this:

WebClient client = new WebClient();
string html = client.DownloadString(@"http://www.table.co.uk");
DataSet dataSet = HtmlTableParser.ParseDataSet(html);

Here is the implementation. It’s not optimised for runtime performance, but it works.

/// <summary>
/// HtmlTableParser parses the contents of an html string into a System.Data DataSet or DataTable.
/// </summary>
public class HtmlTableParser
{
    private const RegexOptions ExpressionOptions = RegexOptions.Singleline | RegexOptions.Multiline | RegexOptions.IgnoreCase;
 
    private const string CommentPattern = "<!--(.*?)-->";
    private const string TablePattern = "<table[^>]*>(.*?)</table>";
    private const string HeaderPattern = "<th[^>]*>(.*?)</th>";
    private const string RowPattern = "<tr[^>]*>(.*?)</tr>";
    private const string CellPattern = "<td[^>]*>(.*?)</td>";
 
    /// <summary>
    /// Given an HTML string containing n HTML tables, parse them into a DataSet containing n DataTables.
    /// </summary>
    /// <param name="html">An HTML string containing n HTML tables</param>
    /// <returns>A DataSet containing a DataTable for each HTML table in the input HTML</returns>
    public static DataSet ParseDataSet(string html)
    {
        DataSet dataSet = new DataSet();
        MatchCollection tableMatches = Regex.Matches(
            WithoutComments(html),
            TablePattern,
            ExpressionOptions);
 
        foreach (Match tableMatch in tableMatches)
        {
            dataSet.Tables.Add(ParseTable(tableMatch.Value));
        }
 
        return dataSet;
    }
 
    /// <summary>
    /// Given an HTML string containing a single table, parse that table to form a DataTable.
    /// </summary>
    /// <param name="tableHtml">An HTML string containing a single HTML table</param>
    /// <returns>A DataTable which matches the input HTML table</returns>
    public static DataTable ParseTable(string tableHtml)
    {
        string tableHtmlWithoutComments = WithoutComments(tableHtml);
 
        DataTable dataTable = new DataTable();
 
        MatchCollection rowMatches = Regex.Matches(
            tableHtmlWithoutComments,
            RowPattern,
            ExpressionOptions);
 
        dataTable.Columns.AddRange(tableHtmlWithoutComments.Contains("<th")
                                       ? ParseColumns(tableHtmlWithoutComments)
                                       : GenerateColumns(rowMatches));
 
        ParseRows(rowMatches, dataTable);
 
        return dataTable;
    }
 
    /// <summary>
    /// Strip comments from an HTML string
    /// </summary>
    /// <param name="html">An HTML string potentially containing comments</param>
    /// <returns>The input HTML string with comments removed</returns>
    private static string WithoutComments(string html)
    {
        return Regex.Replace(html, CommentPattern, string.Empty, ExpressionOptions);
    }
 
    /// <summary>
    /// Add a row to the input DataTable for each row match in the input MatchCollection
    /// </summary>
    /// <param name="rowMatches">A collection of all the rows to add to the DataTable</param>
    /// <param name="dataTable">The DataTable to which we add rows</param>
    private static void ParseRows(MatchCollection rowMatches, DataTable dataTable)
    {
        foreach (Match rowMatch in rowMatches)
        {
            // if the row contains header tags don't use it - it is a header not a row
            if (!rowMatch.Value.Contains("<th"))
            {
                DataRow dataRow = dataTable.NewRow();
 
                MatchCollection cellMatches = Regex.Matches(
                    rowMatch.Value,
                    CellPattern,
                    ExpressionOptions);
 
                for (int columnIndex = 0; columnIndex < cellMatches.Count; columnIndex++)
                {
                    dataRow[columnIndex] = cellMatches[columnIndex].Groups[1].ToString();
                }
 
                dataTable.Rows.Add(dataRow);
            }
        }
    }
 
    /// <summary>
    /// Given a string containing an HTML table, parse the header cells to create a set of DataColumns
    /// which define the columns in a DataTable.
    /// </summary>
    /// <param name="tableHtml">An HTML string containing a single HTML table</param>
    /// <returns>A set of DataColumns based on the HTML table header cells</returns>
    private static DataColumn[] ParseColumns(string tableHtml)
    {
        MatchCollection headerMatches = Regex.Matches(
            tableHtml,
            HeaderPattern,
            ExpressionOptions);
 
        return (from Match headerMatch in headerMatches
                select new DataColumn(headerMatch.Groups[1].ToString())).ToArray();
    }
 
    /// <summary>
    /// For tables which do not specify header cells we must generate DataColumns based on the number
    /// of cells in a row (we assume all rows have the same number of cells).
    /// </summary>
    /// <param name="rowMatches">A collection of all the rows in the HTML table we wish to generate columns for</param>
    /// <returns>A set of DataColumns based on the number of cells in the first row of the input HTML table</returns>
    private static DataColumn[] GenerateColumns(MatchCollection rowMatches)
    {
        int columnCount = Regex.Matches(
            rowMatches[0].ToString(),
            CellPattern,
            ExpressionOptions).Count;
 
        return (from index in Enumerable.Range(0, columnCount)
                select new DataColumn("Column " + Convert.ToString(index))).ToArray();
    }
}

As always, here are the tests. They yield 100% coverage, but I still need to add some asserts on the column names (a sketch of what that might look like follows the tests).

/// <summary>
/// Tests for the HtmlTableParser class
/// </summary>
[TestClass]
public class ParserTest
{
    private TestContext testContextInstance;
 
    /// <summary>
    /// Verify that HtmlTableParser can parse an HTML file containing a single table. The
    /// test file includes a commented out table which should be ignored. Note some tags use
    /// attributes (we test we can parse tags with and without attributes).
    /// </summary>
    [TestMethod]
    [DeploymentItem(@"data\singleTable.txt")]
    public void TestParseSingleTable()
    {
        string html = File.ReadAllText("singleTable.txt");
        DataTable table = HtmlTableParser.ParseTable(html);
 
        AssertTable(GetExpectedData(), table);
    }
 
    /// <summary>
    /// Verify that HtmlTableParser can parse an HTML file containing multiple tables. The
    /// test file includes a commented out table which should be ignored. The test file
    /// contains tables both with and without headers.
    /// </summary>
    [TestMethod]
    [DeploymentItem(@"data\multipleTables.txt")]
    public void TestParseMultipleTables()
    {
        string html = File.ReadAllText("multipleTables.txt");
        DataSet dataSet = HtmlTableParser.ParseDataSet(html);
        Assert.AreEqual(3, dataSet.Tables.Count);
 
        var expected = GetExpectedData();
 
        foreach (DataTable table in dataSet.Tables)
        {
            AssertTable(expected, table);
        }
    }
 
    private static string[][] GetExpectedData()
    {
        return new[]
        {
            new[] { "row 1, cell 1", "row 1, cell 2" },
            new[] { "row 2, cell 1", "row 2, cell 2" }
        };
    }
 
    private static void AssertTable(string[][] expected, DataTable table)
    {
        Assert.AreEqual(expected.Count(), table.Rows.Count, "Table did not contain the expected number of rows");
 
        for (int i = 0; i < expected.Count(); i++)
        {
            for (int j = 0; j < expected[i].Count(); j++)
            {
                string actualElement = (table.Rows[i][j] as string).Trim();
                string expectedElement = expected[i][j];
 
                Assert.AreEqual<string>(expectedElement, actualElement, "Table did not contain the expected element");
            }
        }
    }
}
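
The missing column-name assert could look something like this hypothetical helper (assuming the “Heading 1”/“Heading 2” headers used in the test files below):

    private static void AssertColumnNames(string[] expectedNames, DataTable table)
    {
        Assert.AreEqual(expectedNames.Length, table.Columns.Count, "Table did not contain the expected number of columns");

        for (int i = 0; i < expectedNames.Length; i++)
        {
            Assert.AreEqual(expectedNames[i], table.Columns[i].ColumnName, "Table did not contain the expected column name");
        }
    }

TestParseSingleTable would then call AssertColumnNames(new[] { "Heading 1", "Heading 2" }, table); alongside the existing AssertTable check.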

These are the test files (singleTable.txt followed by multipleTables.txt), which are just some basic HTML.

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
      <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />
    <meta name="robots" content="all" />
      <title>Title</title>
</head>
 
<body>
      <!-- This table is commented out, so it shouldn't be parsed.
      <table border="1">
            <tr  border="1">
                  <th>Commented Heading 1</th>
                  <th>Commented Heading 2</th>
            </tr>
            <tr  border="1">
                  <td>Commented row 1, cell 1</td>
                  <td>Commented row 1, cell 2</td>
            </tr>
            <tr  border="1">
                  <td>Commented row 2, cell 1</td>
                  <td>Commented row 2, cell 2</td>
            </tr>
      </table>
      -->
 
      <!-- The parser should ignore the border attributes -->
      <table border="1">
            <tr  border="1">
                  <th>Heading 1</th>
                  <th>Heading 2</th>
            </tr>
            <tr>
                  <td border="1">row 1, cell 1</td>
                  <td>row 1, cell 2</td>
            </tr>
            <tr  border="1">
                  <td>row 2, cell 1</td>
                  <td>row 2, cell 2</td>
            </tr>
      </table> 
</body>
</html>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
      <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />
    <meta name="robots" content="all" />
      <title>Title</title>
</head>
 
<body>
      <!-- The parser should ignore the border attributes -->
      <table border="1">
            <tr  border="1">
                  <td>row 1, cell 1</td>
                  <td>row 1, cell 2</td>
            </tr>
            <tr  border="1">
                  <td>row 2, cell 1</td>
                  <td>row 2, cell 2</td>
            </tr>
      </table> 
      <!-- This table is commented out, so it shouldn't be parsed.
      <table border="1">
            <tr  border="1">
                  <th>Commented Heading 1</th>
                  <th>Commented Heading 2</th>
            </tr>
            <tr  border="1">
                  <td>Commented row 1, cell 1</td>
                  <td>Commented row 1, cell 2</td>
            </tr>
            <tr  border="1">
                  <td>Commented row 2, cell 1</td>
                  <td>Commented row 2, cell 2</td>
            </tr>
      </table>
      -->
      <table border="1">
            <tr  border="1">
                  <th>Heading 1</th>
                  <th>Heading 2</th>
            </tr>
            <tr  border="1">
                  <td>row 1, cell 1</td>
                  <td>row 1, cell 2</td>
            </tr>
            <tr  border="1">
                  <td>row 2, cell 1</td>
                  <td>row 2, cell 2</td>
            </tr>
      </table> 
      <table>
            <tr>
                  <td>row 1, cell 1</td>
                  <td>row 1, cell 2</td>
            </tr>
            <tr  border="1">
                  <td>row 2, cell 1</td>
                  <td border="1">row 2, cell 2</td>
            </tr>
      </table> 
</body>
</html>
Apr 25 10

Hi-Fi Burn in and Demagnetisation

by Alex Peck

I purchased a second hand copy of the IsoTek Full System Enhancer CD. My aim? To debunk the incredulous claims that this CD can improve the fidelity of my Hi-Fi system. 6moons, amongst others, have given favourable reviews, which made me curious.

The sleeve states this disc can be used for two things: burning in components and demagnetising the whole system. The track listing is as follows:

  1. Full system burn-in & demagnetisation.
  2. Full system burn-in & demagnetisation with low level tones.
  3. Full system rejuvenation, including demagnetisation tones.

Burn In

To be honest, burn in seemed plausible to me. In particular, it seems intuitive that speakers may be “burned in” because they have moving parts which might loosen up over time. I have burned in speakers and been able to discern a change in their sound.

I have also observed that my system sounds better if left powered on. I can’t remember ever noticing that the sound of a cable had changed over time, however, perhaps because I’m not in the habit of listening analytically to my system. I generally prefer listening to music.

I hadn’t previously thought about the scientific explanation for burning in electronic components (or cables for that matter), but I now suspect that it relates to a stabilisation of the intrinsic magnetic state of the conductors. I’ll return to this later.

Demagnetisation

I was extremely sceptical that playing a CD could somehow affect the state of my Hi-Fi such that later reproduction of music would be audibly better. Then I played track 2 of the Full System Enhancer. It sounds like white noise with some clicks and pops, and the occasional continuous tone.

Spectral analysis of Track 2 (via CoolEdit Pro)

The result: a subtle improvement. Whilst I try to avoid Hi-Fi bollocks wherever possible, a tenuous description is required at this point. To expand on “subtle improvement”, I would say that the sound was slightly more realistic in terms of ambience and timbre. This seemed more apparent with live recordings.

Is it really better? Could it really happen?

I’m tempted to say yes, but I have doubts about anything I don’t fully understand. I therefore decided to try to explain what might be happening by applying some basic physics. A quick scan of Wikipedia (which I assume is accurate enough for the purposes of this discussion) highlighted the following pertinent facts:

  • Conductors have atoms with free electrons. The free electrons move around randomly until an electric force is applied, at which point they flow in the direction of the force as a current.
  • When a material is magnetised, the free electrons remain bound to their respective atoms, but behave as if they were orbiting the nucleus in a particular direction, creating a microscopic current.
  • All materials can be magnetised to an extent, even diamagnetic metals such as copper, silver or lead (diamagnetic materials have a relative magnetic permeability of less than one).

From that, I inferred that my Hi-Fi could be magnetised (though perhaps not very much because it is primarily composed of diamagnetic materials). Clearly, when magnetised, a conductor’s ability to transmit an electric current will be affected. Audio signals are transmitted as a voltage in analogue form (everything after the DAC). Therefore, it is possible that there would be an audible difference between the magnetised and demagnetised state.

I know that a magnetic field can induce an electric current, and vice versa. So it seems that there exists a mechanism for a Hi-Fi system to become magnetised through normal use, and demagnetised using a magic sequence of audio signals.

In conclusion, this process seems plausible to me, though truly understanding what is happening would require a firmer grasp of the laws of electromagnetism.

Apr 15 10

Pixels

by Alex Peck

Perhaps this should have been called Voxels, but it’s good nevertheless.

Apr 8 10

Crysis 2 Teaser

by Alex Peck

This doesn’t give too much away. Hopefully by the time it comes out I’ll have a PC fast enough to play Crysis 1.

Mar 4 10

Filtering sqlmetal output using XSLT

by Alex Peck

I needed to generate LINQ to SQL O/RM classes for a subset of the objects in a large database. Unfortunately, sqlmetal doesn’t provide a mechanism to filter its output, so I made my own using XSLT, a batch file and a custom build target. This post explains how.

Overview

When you invoke a build, a custom “BeforeBuild” build target runs to do the code generation. The code generation is driven by a batch file, which does the following:

  1. Run sqlmetal to output a dbml file for the entire database schema.
  2. Using an XML configuration file and an XSLT transform, generate a test .cs file containing a reference to each expected table. When included in our project, this gives us a compile-time check that the expected classes were generated (we make this class internal).
  3. Run a second XSLT transform over the dbml file from step 1, driven by the same configuration file. This step prunes the XML, producing a dbml file that contains only the tables specified in the configuration.
  4. Run sqlmetal again using the result of step 3 as input. This time sqlmetal outputs the C# code for our LINQ to SQL classes.

The XSLT transforms are run from the command line using MSXSL.exe, Microsoft’s command-line XSLT utility.

The generated C# files are part of my Visual Studio project. After the BeforeBuild target has run (doing the code generation), the generated code is compiled into an assembly.

Custom build target

These are the pertinent parts of my project file. You can see that inside my project directory, I created a directory called Prebuild where the work happens. This is where the Configuration.xml, DbmlPruner.xslt, GenerateTestTypes.xslt and generate.bat files live.

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="3.5" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- most of the project file omitted for brevity -->
  <Target Name="BeforeBuild" DependsOnTargets="GenerateDbClasses;"> </Target>
  <Target Name="GenerateDbClasses" Inputs="Prebuild\Configuration.xml;Prebuild\DbmlPruner.xslt;Prebuild\GenerateTestTypes.xslt" Outputs="MasterDb.Generated.cs;TestMasterDbTypes.cs">
    <Exec Command="$(ProjectDir)Prebuild\generate.bat $(ProjectDir) $(TargetName) master.dbml prunedmaster.dbml TestMasterDbTypes.cs MasterDb Configuration.xml master" />
  </Target>
</Project>

Configuration.xml

<?xml version="1.0" encoding="utf-8"?>
<!-- This configuration is used to specify which tables should generate linq to SQL classes, 
  and to generate a sanity check class which verifies all the expected types exist in the 
  generated code.
  -->
<Configuration Name="Master Tables" SourceXml="master.dbml" Namespace="Master.Data.Linq" TestClassName="TestMasterTypes">
  <Table SqlName="dbo.TestTable" DataContextPropertyName="TestTable" ClassName="TestTableRow"/>
</Configuration>

GenerateTestTypes.xslt

<?xml version="1.0" encoding="utf-8"?>
<!-- ===========================================================
  Generate a C# class with members corresponding to all the linq
  to SQL tables specified in the input file.
================================================================ -->
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
  <xsl:output method="text" />
  <xsl:template match="/">
//-----------------------------------------------------------------------
// <![CDATA[<auto-generated>]]>
//     This code was generated by a tool.
//
//     Changes to this file may cause incorrect behavior and will be lost 
//     if the code is regenerated.
// <![CDATA[</auto-generated>]]>
//-----------------------------------------------------------------------
 
// Disable warning CS0169: The private field 'foo' is never used. This is 
// by design.
#pragma warning disable 0169
 
namespace <xsl:value-of select="/Configuration/@Namespace" />
{
    /// <![CDATA[<summary>]]>
    /// This class is provided as a compile time test for the linq to SQL
    /// classes specified in Prebuild/configuration.xml. It will fail to
    /// compile if one of the dependent classes is not generated (or not
    /// generated with the expected name). This is by design.
    /// <![CDATA[</summary>]]>
    internal class <xsl:value-of select="/Configuration/@TestClassName" />
    {      
<xsl:for-each select="/Configuration/Table"><xsl:text>&#9;</xsl:text><xsl:text>&#9;</xsl:text>private <xsl:value-of select="@ClassName" /> <xsl:value-of select="' '" /> <xsl:value-of select="concat(@ClassName, 'Member')" />;<xsl:text>&#xa;</xsl:text></xsl:for-each><xsl:text>&#9;</xsl:text>}
}
 
#pragma warning restore 0169
  </xsl:template>
</xsl:stylesheet>
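
For illustration, running this transform over the Configuration.xml above produces roughly the following C# (reconstructed by hand, with the generated header comments trimmed). One private field is emitted per <Table> element, so the file fails to compile if a LINQ to SQL class is missing or misnamed:

#pragma warning disable 0169

namespace Master.Data.Linq
{
    internal class TestMasterTypes
    {
        private TestTableRow TestTableRowMember;
    }
}

#pragma warning restore 0169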

DbmlPruner.xslt

<?xml version="1.0" encoding="utf-8"?>
<!-- ===========================================================
  Replicate a .dbml file based on the tables specified in a 
  configuration.
================================================================ -->
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
	 xmlns:sql="http://schemas.microsoft.com/linqtosql/dbml/2007"
     version="1.0" >
  	<xsl:output method="xml" version="1.0" encoding="utf-8" indent="yes"
  	  omit-xml-declaration = "no"/>
 
	<xsl:template match="Configuration">
 
		<!-- load sql metal output as $database -->
		<xsl:variable name="database" select="document(@SourceXml)"/>
    <xsl:variable name="namespace" select="namespace-uri($database/sql:Database)"/>
 
		<xsl:comment> =====================================================================================
  <xsl:value-of select="@Name" /> generated from database <xsl:value-of select="$database/sql:Database/@Name" /> (<xsl:value-of select="@SourceXml" />)
========================================================================================== </xsl:comment>
 
		<!-- Output a tree which replicates @SourceXml but contains only the table nodes in the configuration -->
		<xsl:element name="Database" namespace="{$namespace}">
			<xsl:attribute name="Name">
        		<xsl:value-of select="$database/sql:Database/@Name" />
      		</xsl:attribute>
 
			<xsl:for-each select="/Configuration/Table">
 
				<xsl:variable name="sqlName" select="@SqlName"/>
 
				<!-- only output a table element when the source table exists -->
				<xsl:if test="$database/sql:Database/sql:Table[@Name=$sqlName]">
 
					<!-- Output the table substituting Member and Type for the ClassName in the configuration -->
					<xsl:element name="Table" namespace="{$namespace}">
						<xsl:attribute name="Name">
							<xsl:value-of select="@SqlName"/>
						</xsl:attribute>
						<xsl:attribute name="Member">
							<xsl:value-of select="@DataContextPropertyName"/>
						</xsl:attribute>
 
						<xsl:element name="Type" namespace="{$namespace}">
							<xsl:attribute name="Name">
								<xsl:value-of select="@ClassName"/>
							</xsl:attribute>
 
							<!-- Copy the children (Columns etc) -->
							<xsl:copy-of select="$database/sql:Database/sql:Table[@Name=$sqlName]/sql:Type/*"/>
						</xsl:element>
 
					</xsl:element>
 
				</xsl:if>
			</xsl:for-each>
		</xsl:element>
	</xsl:template>
 
</xsl:stylesheet>

generate.bat

This is my entire generate.bat file; you can glean the input arguments from the custom build target above.

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
:: Generate LINQ to SQL classes based on tables defined in an xml config
::
 
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
 
setlocal 
 
set SQLMETAL=<put your path here!>\sdk\Win2008\Bin\sqlmetal.exe
set MSXSL=<put your path here!>\msxsl.exe
 
set PRJDIR=%1%
set PREDIR=%PRJDIR%Prebuild\
 
set NAMESPACE=%2
 
set METALOUT=%PREDIR%%3
set METALIN=%PREDIR%%4
 
set COMPILETESTCLASS=%PRJDIR%%5
set CONTEXTCLASS=%6
set CONTEXTCLASSFILE=%PRJDIR%%CONTEXTCLASS%.Generated.cs
 
set CONFIGPATH=%PREDIR%%7
set DATABASE=%8
 
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
 
echo ================ Generation of LINQ to SQL classes started ================
echo Using %CONFIGPATH%
 
echo 1. Building %METALOUT% for entire %DATABASE% database 
%SQLMETAL% /conn:"server=localhost; database=%DATABASE%;Integrated Security=SSPI" /dbml:%METALOUT%
if errorlevel 1 goto :Failed
 
echo 2. Generating %COMPILETESTCLASS% to test linq to sql types were generated correctly at compile time
%MSXSL% %CONFIGPATH% %PREDIR%GenerateTestTypes.xslt -o %COMPILETESTCLASS%
if errorlevel 1 goto :Failed
 
echo 3. Building %METALIN% from %CONFIGPATH%
%MSXSL% %CONFIGPATH% %PREDIR%DbmlPruner.xslt -o %METALIN%
if errorlevel 1 goto :Failed
 
echo 4. Generating %CONTEXTCLASSFILE% using %METALIN%
%SQLMETAL% %METALIN% /code:%CONTEXTCLASSFILE% /language:csharp /context:%CONTEXTCLASS% /namespace:%NAMESPACE% /serialization:Unidirectional
if errorlevel 1 goto :Failed
 
echo ======== Generation of LINQ to SQL classes completed successfully =========
endlocal
goto :EOF
 
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
:Failed
 
echo Configuration %CONFIGPATH% FAILED
endlocal
exit /B 1
Mar 1 10

Folding plug

by Alex Peck

This is a really nice design. The voiceover is considerably less inspiring.

Feb 11 10

XSLT Reference Links

by Alex Peck

Over the last couple of days I started using XSLT for the first time. I found these useful links, which were enough to get me going.

Feb 8 10

SqueezeServer WinForms Client

by Alex Peck

A couple of days ago I finally got fed up with the SqueezeCenter web interface. It’s not that it’s bad, it’s just that I open a lot of tabs in my browser, then can’t find the SqueezeCenter page when I need it.

A simple winforms Squeeze client.

I thought it would be nice to be able to control SqueezeCenter from something sitting in the taskbar, so over the last couple of days I made a simple .NET client. I’ve got 90% of what I need running, so as a proof of concept it has served its purpose.

Most of my effort was spent on data access and marshalling calls back onto the UI thread. Once I’ve finished under the hood I might make a WPF UI layer, which would make it much more presentable.
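
For anyone curious, the thread marshalling boils down to the standard WinForms Control.BeginInvoke pattern. A rough sketch, with a hypothetical form and callback rather than code from my actual client:

using System;
using System.Windows.Forms;

public class PlayerForm : Form
{
    private readonly Label nowPlayingLabel = new Label { Dock = DockStyle.Top };

    public PlayerForm()
    {
        Controls.Add(nowPlayingLabel);
    }

    // Called on a worker thread whenever the server reports a new track.
    // BeginInvoke marshals the UI update back onto the thread that owns the form.
    public void OnNowPlayingChanged(string trackName)
    {
        if (InvokeRequired)
        {
            BeginInvoke(new Action<string>(OnNowPlayingChanged), trackName);
            return;
        }

        nowPlayingLabel.Text = trackName;
    }
}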