Friday, 31 December 2010

WCF service reference not creating correct proxy classes

Problem

I had a test project using NUnit to run unit tests for some components of a WCF service. In addition, I decided to include some integration tests that actually called the WCF service running in IIS on localhost. To run the integration tests I added a service reference (which should generate the proxy classes), but some types I had expected to find didn’t appear as proxies. The missing types were declared as known types on the WCF service definition and all of their data contract attributes were correct.
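
To give a flavour of what that looks like, here’s a minimal sketch with hypothetical type names (not the actual service types); declaring a derived type as a known type on a data contract is one common way to set this up:

using System.Runtime.Serialization;

[DataContract]
[KnownType(typeof(BusinessCustomer))]
public class Customer
{
    [DataMember]
    public string Name { get; set; }
}

// Derived type the service can return; exposed to clients via the KnownType attribute above.
[DataContract]
public class BusinessCustomer : Customer
{
    [DataMember]
    public string CompanyNumber { get; set; }
}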

When I tried to use a type I had expected to find as a proxy class, ReSharper tried to help me out by referring me to the actual classes in the service projects, not to generated proxies. Even stranger, when I looked at the Reference.cs file (which should contain the proxy classes) the proxy classes were indeed missing, but I could see the actual classes from the service projects being used instead.

To find Reference.cs, turn on Show All Files in Solution Explorer and expand the service reference.

Solution

The fact that ReSharper was pointing at the service projects was the clue. Because this was a test project, I had unit tests that created instances of classes from the service projects, and so the test project had references to those projects. When I added the service reference, the existing types in the referenced assemblies were reused in preference to generating proxies (the Add Service Reference dialog’s “Reuse types in referenced assemblies” option is switched on by default).

I removed the references to the service projects, commented out code that needed those references and then re-added the service reference. The proxies were generated correctly.

The moral of the story: don’t add project references to the actual service projects as well as a service reference for the same types in one project. It’s a good idea to keep integration tests in a separate project so the project references and service references don’t trip over each other.


Thursday, 30 December 2010

Shapefile basics

It looks like I’m going to have to get to grips with shapefiles and GIS data so here’s a post to give me a heads-up on shapefiles.

What is a shapefile?

“A shapefile stores nontopological geometry and attribute information for the spatial features in a data set. The geometry for a feature is stored as a shape comprising a set of vector coordinates.

Because shapefiles do not have the processing overhead of a topological data structure, they have advantages over other data sources such as faster drawing speed and edit ability. Shapefiles handle single features that overlap or that are noncontiguous. They also typically require less disk space and are easier to read and write.

Shapefiles can support point, line, and area features. Area features are represented as closed loop, double-digitized polygons. Attributes are held in a dBASE® format file. Each attribute record has a one-to-one relationship with the associated shape record.” *

Note that a geographic element forming part of a shapefile is referred to as a feature.

“A representation of a geographic feature that has both a spatial representation referred to as a "shape" and a set of attributes.” ***

So, key points:

  • The data is nontopological geometry, making it faster to draw, edit and search.
  • Shapefiles can support point, line, and area features.

The file format is capable of storing a mixture of different shape types, but the specification prevents this: "All the non-Null shapes in a shapefile are required to be of the same shape type."
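
The shape type is recorded as an integer code in the main file header (and repeated in each shape record). As a quick reference, here’s that list sketched as a C# enum; the enum name is mine, but the values are taken from the ESRI whitepaper referenced below:

// Shape type codes as listed in the ESRI Shapefile Technical Description.
public enum ShapeType
{
    NullShape = 0,
    Point = 1,
    PolyLine = 3,
    Polygon = 5,
    MultiPoint = 8,
    PointZ = 11,
    PolyLineZ = 13,
    PolygonZ = 15,
    MultiPointZ = 18,
    PointM = 21,
    PolyLineM = 23,
    PolygonM = 25,
    MultiPointM = 28,
    MultiPatch = 31
}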

Parts of a shapefile

Firstly, a shapefile actually consists of several files, not one: a combination of mandatory and optional files. Every shapefile contains three mandatory files:

  • .shp - the shape file containing the feature geometry
  • .shx - the shape index containing a positional index of the feature geometry (facilitates quick searching)
  • .dbf - the attribute file containing attributes for each shape (dBase IV format)

“An ESRI shapefile consists of a main file, an index file, and a dBASE table. The main file is a direct access, variable-record-length file in which each record describes a shape with a list of its vertices. In the index file, each record contains the offset of the corresponding main file record from the beginning of the main file. The dBASE table contains feature attributes with one record per feature. The one-to-one relationship between geometry and attributes is based on record number. Attribute records in the dBASE file must be in the same order as records in the main file.” **
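
To get a feel for the main file layout, here’s a minimal C# sketch that reads the fixed 100-byte header of a .shp file and prints the shape type and bounding box. The layout (big-endian file code and length, little-endian version, shape type and bounds) comes from the ESRI whitepaper; the file path is just an example:

using System;
using System.IO;

class ShpHeaderDump
{
    static void Main()
    {
        // Hypothetical path - point this at any .shp main file.
        using (var reader = new BinaryReader(File.OpenRead(@"C:\data\example.shp")))
        {
            int fileCode = ReadBigEndianInt32(reader);      // always 9994
            reader.BaseStream.Seek(24, SeekOrigin.Begin);   // skip five unused ints
            int fileLength = ReadBigEndianInt32(reader);    // length in 16-bit words, including the header
            int version = reader.ReadInt32();               // always 1000 (little-endian)
            int shapeType = reader.ReadInt32();             // see the ShapeType codes above
            double xMin = reader.ReadDouble();
            double yMin = reader.ReadDouble();
            double xMax = reader.ReadDouble();
            double yMax = reader.ReadDouble();

            Console.WriteLine("File code:  {0}", fileCode);
            Console.WriteLine("File size:  {0} bytes", fileLength * 2);
            Console.WriteLine("Version:    {0}", version);
            Console.WriteLine("Shape type: {0}", shapeType);
            Console.WriteLine("Bounds:     ({0}, {1}) to ({2}, {3})", xMin, yMin, xMax, yMax);
        }
    }

    // The file code and length are stored big-endian; everything else is little-endian.
    static int ReadBigEndianInt32(BinaryReader reader)
    {
        byte[] bytes = reader.ReadBytes(4);
        Array.Reverse(bytes);
        return BitConverter.ToInt32(bytes, 0);
    }
}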

There are a number of optional files but I am currently interested in:

  • .prj – the projection file containing the coordinate system and projection information in plain text
  • .sbn and .sbx – both containing the spatial index of the features
  • .shp.xml - metadata in XML format

 

References

* http://www.esri.com/library/whitepapers/pdfs/shapefile.pdf, p.5
** http://www.esri.com/library/whitepapers/pdfs/shapefile.pdf, p.6
*** http://www.esri.com/library/whitepapers/pdfs/shapefile.pdf, p.31

Thursday, 30 December 2010

Viewing contents of the Global Assembly Cache (GAC)

Here’s a quick tip, if you view the GAC using Windows Explorer you get a nice sanitised view; the underlying folders are aggregated and the contents displayed. This can be useful for quickly finding and removing things from the GAC. However, it doesn’t show you where things really are.
The Windows Explorer view is provided by a shell extension and it is possible to disable it but you’ll lose drag and drop installation into the GAC.
To see where things really live, open a DOS prompt and navigate to the GAC directory (typically C:\Windows\assembly).
The directory listing will reveal the following set of folders:
  • GAC – .NET 1.x assemblies
  • GAC_32 – .NET 2.x assemblies built for 32-bit
  • GAC_64 – .NET 2.x assemblies built for 64-bit (only on 64-bit machines)
  • GAC_MSIL – .NET 2.x assemblies that are architecture independent (32 or 64-bit)
  • NativeImages_Vx_Architecture – Each of these folders is used for storing native images of .NET assemblies that have been pre-compiled using NGen (i.e. no need for JIT). There can be a number of these folders for different .NET versions and compilation targets (e.g. 32 or 64-bit)
  • temp, tmp – Temporary folders
You can then navigate to individual folders and list the contents.
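
If you want the same information from code rather than a DOS prompt, here’s a minimal sketch that lists the GAC subfolders (it assumes the pre-.NET 4 GAC location under %windir%\assembly):

using System;
using System.IO;

class ListGacFolders
{
    static void Main()
    {
        // Assumes the pre-.NET 4 GAC location under the Windows directory.
        string gacRoot = Path.Combine(Environment.GetEnvironmentVariable("windir"), "assembly");

        // Lists GAC, GAC_32, GAC_64, GAC_MSIL, NativeImages_* and the temp folders.
        foreach (string folder in Directory.GetDirectories(gacRoot))
        {
            Console.WriteLine(Path.GetFileName(folder));
        }
    }
}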

Tuesday, 30 November 2010

Useful tools

I always seem to be setting up development machines and need to install the same set of tools over and over. To help me keep track of them I’m starting a list:

  • ILSpy – An open-source .NET assembly browser and decompiler; a replacement for Reflector, which is now a commercial product. http://wiki.sharpdevelop.net/ilspy.ashx
  • Regulator – An advanced, free regular expression testing and learning tool. http://www.osherove.com/tools
  • XMLPad – XML Notepad 2007 provides a simple, intuitive user interface for browsing and editing XML documents. http://www.microsoft.com/downloads/en/details.aspx?familyid=72d6aa49-787d-4118-ba5f-4f30fe913628&displaylang=en
  • Notepad++ – A free source code editor and Notepad replacement that supports several languages. Running in the MS Windows environment, its use is governed by the GPL licence. http://notepad-plus-plus.org/
  • BareTail – A real-time log file monitoring tool. http://www.baremetalsoft.com/baretail/
  • Gallio – An open, extensible and neutral automation platform for .NET that provides a common object model, runtime services and tools (such as test runners) that may be leveraged by any number of test frameworks. http://www.gallio.org/
  • Sandcastle – A documentation compiler for managed class libraries. http://sandcastle.codeplex.com/releases/view/47665
  • NDepend – A Visual Studio tool to manage complex .NET code and achieve high code quality. http://www.ndepend.com/
  • PartCover – An open source code coverage tool. https://github.com/sawilde/partcover.net4
  • WinMerge – An open source differencing and merging tool for Windows. http://winmerge.org/
  • Query Express – A simple Query Analyzer look-alike; being small and free, it can be run where the SQL Server client tools are not installed or licensed. http://www.albahari.com/queryexpress.aspx
  • AnjLab SQL Profiler – A free profiler for SQL Server Express Edition that provides most of the functionality of the standard profiler. http://anjlab.com/en/projects/opensource/sqlprofiler
  • TreeTrim – A command-line tool that trims your source code tree, removing debug files, source control bindings and temporary files. http://code.google.com/p/treetrim/
  • LogParser – A powerful, versatile tool that provides universal query access to text-based data such as log files, XML files and CSV files, as well as key data sources on the Windows operating system such as the Event Log, the registry, the file system and Active Directory. http://www.microsoft.com/downloads/en/details.aspx?FamilyID=890cd06b-abf8-4c25-91b2-f8d975cf8c07&displaylang=en
  • Log Parser Lizard – A GUI for parsing log files (use with LogParser). http://www.lizard-labs.net/log_parser_lizard.aspx
  • cURL – A computer software project providing a library and command-line tool for transferring data using various protocols. http://curl.haxx.se/
  • Wget – GNU Wget is a free software package for retrieving files using HTTP, HTTPS and FTP, the most widely-used Internet protocols. It is a non-interactive command-line tool, so it may easily be called from scripts, cron jobs, terminals without X Window support, etc. http://www.gnu.org/software/wget/
  • Console2 – A Windows console window enhancement featuring multiple tabs, text editor-like text selection, different background types, alpha and color-key transparency, configurable fonts and different window styles. http://sourceforge.net/projects/console/
  • WireShark – A network protocol analyser. http://www.wireshark.org/
  • Fiddler – A web debugging proxy which logs all HTTP(S) traffic between your computer and the Internet. http://fiddler2.com/fiddler2/

Thursday, 18 November 2010

WCF service throwing immediate timeout

The problem

I was recently writing automated integration tests against a WCF service using NUnit when out of the blue all the tests failed and kept failing. An examination of the exceptions thrown showed that in each case a CommunicationException was being thrown because of a timeout.

What was most perplexing was that the apparent timeouts were being reported immediately but the client configuration was set to defaults:

<system.serviceModel>
        <bindings>
            <wsHttpBinding>
                <binding name="WSHttpBinding_EnquirySubmissionService" closeTimeout="00:01:00"
                    openTimeout="00:01:00" receiveTimeout="00:10:00" sendTimeout="00:01:00"
                    bypassProxyOnLocal="false" transactionFlow="false" hostNameComparisonMode="StrongWildcard"
                    maxBufferPoolSize="524288" maxReceivedMessageSize="65536"
                    messageEncoding="Text" textEncoding="utf-8" useDefaultWebProxy="true"
                    allowCookies="false">
                    <!-- snip -->
                </binding>
            </wsHttpBinding>
        </bindings>
        <!-- snip -->
</system.serviceModel>

The service was using WsHttpBinding and was hosted by IIS over HTTPS. The service was also secured using TransportWithMessageCredential and UserName credentials on the message (i.e. username and password).
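
For reference, the same client binding can be built in code rather than configuration. This is a minimal sketch of those security settings; the endpoint address and proxy type are hypothetical, not taken from the real project:

using System;
using System.ServiceModel;

class ClientBindingSketch
{
    static void Main()
    {
        // HTTPS transport security with username/password credentials at the message level.
        var binding = new WSHttpBinding(SecurityMode.TransportWithMessageCredential);
        binding.Security.Message.ClientCredentialType = MessageCredentialType.UserName;

        // Hypothetical endpoint address and generated proxy usage:
        var address = new EndpointAddress("https://localhost/EnquirySubmissionService.svc");
        // var client = new EnquirySubmissionServiceClient(binding, address);
        // client.ClientCredentials.UserName.UserName = "user";
        // client.ClientCredentials.UserName.Password = "password";
        Console.WriteLine(address.Uri);
    }
}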

 

The solution

Firstly I enabled tracing on the service by modifying the Web.config file to include the following:

<system.diagnostics> 
    <sources> 
        <source name="System.ServiceModel" switchValue="Information,ActivityTracing" propagateActivity="true"> 
            <listeners> 
                <add name="xml" /> 
            </listeners> 
        </source> 
        <source name="System.ServiceModel.MessageLogging"> 
            <listeners> 
                <add name="xml" /> 
            </listeners> 
        </source> 
    </sources> 
    <sharedListeners> 
        <add initializeData="C:\logs\TracingAndLogging-client.svclog" type="System.Diagnostics.XmlWriterTraceListener" name="xml" /> 
    </sharedListeners> 
    <trace autoflush="true" />
</system.diagnostics>

I then queried the service again to provoke the timeout communication exception and found the trace log contained:

System.ServiceModel.Security.MessageSecurityException: Message security verification failed. ---> System.ComponentModel.Win32Exception: The event log file is full
at System.Diagnostics.EventLog.InternalWriteEvent(UInt32 eventID, UInt16 category, EventLogEntryType type, String[] strings, Byte[] rawData, String currentMachineName) 
at System.Diagnostics.EventLog.WriteEvent(EventInstance instance, Byte[] data, Object[] values) 
at System.Diagnostics.EventLog.WriteEvent(String source, EventInstance instance, Object[] values) 
at System.ServiceModel.Security.SecurityAuditHelper.WriteEventToApplicationLog(EventInstance instance, Object[] parameters) 
at System.ServiceModel.Security.SecurityAuditHelper.WriteMessageAuthenticationSuccessEvent(AuditLogLocation auditLogLocation, Boolean suppressAuditFailure, Message message, Uri serviceUri, String action, String clientIdentity) 
at System.ServiceModel.Security.SecurityProtocol.OnIncomingMessageVerified(Message verifiedMessage) 
at System.ServiceModel.Security.TransportSecurityProtocol.VerifyIncomingMessageCore(Message& message, TimeSpan timeout) 
at System.ServiceModel.Security.TransportSecurityProtocol.VerifyIncomingMessage(Message& message, TimeSpan timeout)

So, the problem actually was that the Application event log was full! The service had WCF security auditing enabled and, judging by the stack trace, the failure to write the audit event caused message security verification to fail, which the client surfaced as an immediate CommunicationException rather than a genuine timeout. Having emptied the event log, everything started working again.
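
For what it’s worth, the offending log can also be inspected and fixed from code. Here’s a minimal sketch using System.Diagnostics.EventLog (it needs administrative rights, and clearing the log obviously discards its contents):

using System;
using System.Diagnostics;

class EventLogMaintenance
{
    static void Main()
    {
        using (var appLog = new EventLog("Application"))
        {
            Console.WriteLine("Entries:  {0}", appLog.Entries.Count);
            Console.WriteLine("Overflow: {0}", appLog.OverflowAction);

            // Option 1: let Windows overwrite entries older than 7 days when the log is full.
            appLog.ModifyOverflowPolicy(OverflowAction.OverwriteOlder, 7);

            // Option 2: clear the log outright (which is what I did here).
            // appLog.Clear();
        }
    }
}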


Monday, 8 November 2010

Compiled help files (.chm) not working

Here’s a quickie that’s bugged me a few times, and I can never remember the solution.

Sometimes when you download a .chm file, open it and try to navigate around you get an “address is not valid” error such as the following:

[Screenshot: “The address is not valid” error shown in the help viewer]

The solution is quick and simple:

  1. Right-click the .chm file and then click Properties.
  2. Click the Unblock button.
  3. The .chm file may now be opened and will work normally.
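
The Unblock button works by removing the Zone.Identifier alternate data stream that Windows attaches to files downloaded from the Internet. If you ever need to do the same thing from code, here’s a minimal sketch using P/Invoke (the file path is hypothetical):

using System;
using System.Runtime.InteropServices;

class UnblockFile
{
    // DeleteFile can remove an NTFS alternate data stream directly.
    [DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
    static extern bool DeleteFile(string name);

    static void Main()
    {
        // Hypothetical path - point this at the blocked .chm file.
        bool removed = DeleteFile(@"C:\Downloads\SomeHelp.chm:Zone.Identifier");
        Console.WriteLine(removed ? "Unblocked" : "Nothing to remove (or access denied)");
    }
}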

Sunday, 7 November 2010

When to install assemblies into the GAC

I have recently been working on an established project which makes use of the Global Assembly Cache (GAC) for most of the assemblies in the application. I have found the experience quite frustrating because keeping the assemblies in sync with the code has been tedious. But I realised I wasn’t really sure if using the GAC was a good idea or not so I’ve done a bit of research.

“Each computer where the common language runtime is installed has a machine-wide code cache called the global assembly cache. The global assembly cache stores assemblies specifically designated to be shared by several applications on the computer.

You should share assemblies by installing them into the global assembly cache only when you need to. As a general guideline, keep assembly dependencies private, and locate assemblies in the application directory unless sharing an assembly is explicitly required. In addition, it is not necessary to install assemblies into the global assembly cache to make them accessible to COM interop or unmanaged code.” *

OK, so there’s the first point. Microsoft suggest, “You should share assemblies by installing them into the global assembly cache only when you need to”.

The GAC is useful for deploying assemblies shared by a set of applications, but I like the possibility of XCOPY deployment. That said, deploying an assembly to the GAC is reported to improve its load performance: strongly named assemblies load faster from the GAC because their strong names are verified when they are installed rather than at runtime, so in effect the .NET Framework skips runtime verification for assemblies loaded from the GAC.
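
Incidentally, a quick way to check where an assembly was actually loaded from, and whether it came from the GAC, is to ask the Assembly class. A small sketch (System.dll is used purely as an example of an assembly that is normally in the GAC):

using System;
using System.Reflection;

class GacCheck
{
    static void Main()
    {
        // Any loaded assembly will do; System.dll is just an example.
        Assembly asm = typeof(Uri).Assembly;

        Console.WriteLine("Full name: " + asm.FullName);             // includes version and public key token
        Console.WriteLine("Location:  " + asm.Location);             // physical path the CLR loaded it from
        Console.WriteLine("In GAC:    " + asm.GlobalAssemblyCache);  // true if loaded from the GAC
    }
}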

Back to Microsoft:

“There are several reasons why you might want to install an assembly into the global assembly cache:

  • Shared location.

Assemblies that should be used by applications can be put in the global assembly cache. For example, if all applications should use an assembly located in the global assembly cache, a version policy statement can be added to the Machine.config file that redirects references to the assembly.

  • File security.

Administrators often protect the systemroot directory using an Access Control List (ACL) to control write and execute access. Because the global assembly cache is installed in the systemroot directory, it inherits that directory's ACL. It is recommended that only users with Administrator privileges be allowed to delete files from the global assembly cache.

  • Side-by-side versioning.

Multiple copies of assemblies with the same name but different version information can be maintained in the global assembly cache.

  • Additional search location.

The common language runtime checks the global assembly cache for an assembly that matches the assembly request before probing or using the codebase information in a configuration file.” **

I find reasons such as side-by-side versioning quite compelling. I for one have worked on projects where various open source tools refer to different versions of other libraries, and getting all of the assemblies to live together in a single application directory can be a challenge. However, it must be noted that the article quoted above also states:

“You should share assemblies by installing them into the global assembly cache only when necessary. As a general guideline, keep assembly dependencies private and locate assemblies in the application directory unless sharing an assembly is explicitly required.” **

* Global Assembly Cache
** Working with Assemblies and the Global Assembly Cache
