
Category: Development

Updated error codes for all Validator tools

By , May 12, 2017 12:01 pm

We’ve just updated the documentation for all our Validator tools to include an up-to-date list of exit codes. You may find these useful if you’re running the tools from the command line.

These error codes apply to C++ Bug Validator, C++ Coverage Validator, C++ Memory Validator, C++ Performance Validator, C++ Thread Validator, .Net Coverage Validator, .Net Memory Validator, .Net Performance Validator and VM Validator.

0 All ok
-1 Unknown error. An unexpected error occurred starting the runtime
-2 Application started ok. You should not see this code returned
-3 Application failed to start. E.g. runtime not present, not an executable or injection dll not present
-4 Target application is not an application
-5 Don’t know what format the executable is, cannot process it
-6 Not a 32 bit application
-7 Not a 64 bit application
-8 Using incorrect MSVCR(8|9).DLL that links to CoreDLL.dll (incorrect DLL is from WinCE)
-9 Win16 app cannot start these because we can’t inject into them
-10 Win32 app – not used
-11 Win64 app – not used
-12 .Net application
-13 User bailed out because app not linked to MSVCRT dynamically
-14 Not found in launch history
-15 DLL to inject was not found
-16 Startup directory does not exist
-17 Symbol server directory does not exist
-18 Could not build a command line
-19 No runtime specified, cannot execute script (or Java) (obsolete)
-20 Java arguments are OK – not an error (obsolete)
-21 Java agentlib supplied that is not allowed because Java Bug Validator uses it (obsolete)
-22 Java xrun supplied that is not allowed because Java Bug Validator uses it (obsolete)
-23 Java cp supplied that is not allowed because Java Bug Validator uses it (obsolete)
-24 Java classpath supplied that is not allowed because Java Bug Validator uses it (obsolete)
-25 Firefox is already running, please close it (obsolete)
-26 Lua runtime DLL version is not known (obsolete)
-27 Not compatible software
-28 InjectUsingCreateProcess, no DLL name supplied
-29 InjectUsingCreateProcess, Unable to open PE File when inspecting DLL
-30 InjectUsingCreateProcess, Invalid PE File when inspecting DLL
-31 InjectUsingCreateProcess, No Kernel32 DLL
-32 InjectUsingCreateProcess, NULL VirtualFree() from GetProcAddress
-33 InjectUsingCreateProcess, NULL GetModuleHandleW() from GetModuleHandleW
-34 InjectUsingCreateProcess, NULL LoadLibraryW() from LoadLibraryW
-35 InjectUsingCreateProcess, NULL FreeLibrary() from FreeLibrary
-36 InjectUsingCreateProcess, NULL VirtualProtect() from GetProcAddress
-37 InjectUsingCreateProcess, NULL VirtualFree() from GetProcAddress
-38 InjectUsingCreateProcess, unable to find DLL load address
-39 InjectUsingCreateProcess, unable to write to remote process’s memory
-40 InjectUsingCreateProcess, unable to read remote process’s memory
-41 InjectUsingCreateProcess, unable to resume a thread
-42 UPX compressed – cannot process such executables
-43 Java class not found in CLASSPATH
-44 Failed to launch the 32 bit svlGetProcAddressHelperUtil.exe
-45 Unknown error with svlGetProcAddressHelperUtil.exe
-46 Couldn’t load specified DLL into svlGetProcAddressHelperUtil.exe
-47 Couldn’t find function in the DLL svlGetProcAddressHelperUtil.exe
-48 Missing DLL argument svlGetProcAddressHelperUtil.exe
-49 Missing function argument svlGetProcAddressHelperUtil.exe
-50 Missing svlGetProcAddressHelperUtil.exe
-51 Target process has a manifest that requires elevation
-52 svlInjectIntoProcessHelper_x64.exe not found
-53 svlInjectIntoProcessHelper_x64.exe failed to start
-54 svlInjectIntoProcessHelper_x64.exe failed to return error code
-55 getImageBase() worked ok
-56 ReadFile() failed in getImageBase()
-57 NULL pointer when trying to allocate memory
-58 CreateFile() failed in getImageBase()
-59 ReadProcessMemory() failed in getImageBase()
-60 VirtualQueryEx() failed in getImageBase()
-61 Bad /appName argument in svlInjectIntoProcessHelper_x64.exe
-62 Bad /dllName argument in svlInjectIntoProcessHelper_x64.exe
-63 Bad /procId argument in svlInjectIntoProcessHelper_x64.exe
-64 Failed to OpenProcess in svlInjectIntoProcessHelper_x64.exe
-65 A DLL that the .exe depends upon cannot be found

Working with Dev C++

By , December 14, 2016 5:57 pm

We’ve had a few people asking how to configure C++ Memory Validator to work with programs built using Dev C++. Dev C++ is an IDE for developing programs with the MinGW compiler.

We tested using this download of Dev C++.

Debug information

Any program built using the default Dev C++ settings generates a binary whose debugging information is in a format our tools cannot read. However, the MinGW compiler can emit debug information in several formats, including COFF and STABS, both of which our tools do support. You can enable these formats with the -gcoff and -gstabs compiler flags. We recommend using STABS symbols.

Configuring Dev C++

Open the Project Options… dialog from the Project menu.

Choose the Parameters tab. Add the option -gstabs to all three columns. Click OK.

Now that you have configured the debug options all you need to do is to rebuild your project to ensure the debug information is present.


The correct way to determine if a file is a directory.

By , November 30, 2016 1:41 pm

After writing software for Microsoft’s Windows platform for 20 years I thought I knew all there was to know about GetFileAttributes(). Until I found a rather odd and subtle bug in some code that interacted with data supplied by the user of the software. A call I expected to fail would succeed. Naturally this meant the software didn’t make the right choices, and instead of being presented with a helpful dialog explaining what had failed, the user was left with software sitting silently in a corner, humming to itself, waiting for them to work out what had happened. The failure was that I was presenting incorrect data to GetFileAttributes() and assuming it would always fail for bad input. How wrong I was!

I thought I’d write up what can go wrong with GetFileAttributes().

It’s tempting to test if a file is a directory by writing code like this:

if ((GetFileAttributes(fileName) & FILE_ATTRIBUTE_DIRECTORY) != 0)
    // file is a directory

The above looks logically correct. But there are problems with it.

First, a refresher on file attribute values…

File Attributes

The list of defined file attributes is in WinNT.h. The values are shown below.

#define FILE_ATTRIBUTE_READONLY             0x00000001  
#define FILE_ATTRIBUTE_HIDDEN               0x00000002  
#define FILE_ATTRIBUTE_SYSTEM               0x00000004  
#define FILE_ATTRIBUTE_DIRECTORY            0x00000010  
#define FILE_ATTRIBUTE_ARCHIVE              0x00000020  
#define FILE_ATTRIBUTE_DEVICE               0x00000040  
#define FILE_ATTRIBUTE_NORMAL               0x00000080  
#define FILE_ATTRIBUTE_TEMPORARY            0x00000100  
#define FILE_ATTRIBUTE_SPARSE_FILE          0x00000200  
#define FILE_ATTRIBUTE_REPARSE_POINT        0x00000400  
#define FILE_ATTRIBUTE_COMPRESSED           0x00000800  
#define FILE_ATTRIBUTE_OFFLINE              0x00001000  
#define FILE_ATTRIBUTE_ENCRYPTED            0x00004000  
#define FILE_ATTRIBUTE_VIRTUAL              0x00010000  

Rather strangely, the invalid-attributes value, INVALID_FILE_ATTRIBUTES, is defined in a different file: WinBase.h.


Problem 1

What if GetFileAttributes() fails? If the file doesn’t exist, the call fails. If the filename specifies a computer name, the call fails. See the GetFileAttributes() documentation for more information. When GetFileAttributes() fails it returns INVALID_FILE_ATTRIBUTES, and this error value passes the above test. OK, so add an additional check and the code becomes

DWORD attribs;

attribs = GetFileAttributes(fileName);
if ((attribs != INVALID_FILE_ATTRIBUTES) &&
    ((attribs & FILE_ATTRIBUTE_DIRECTORY) != 0))
    // file is a directory

Problem 2

Even with the above file-does-not-exist problem solved there is another problem. The file could be a directory, but a directory that you don’t want. For example, what if you’ve allowed the user to specify the directory name and they typed _T(“/”)? Or what if your filename-creation code has a bug that, when passed an empty name, produces a calculated filename of _T(“\”)? What then?

In both of these cases GetFileAttributes() returns 0x16.


0x16 means hidden (0x02), system (0x04), directory (0x10).

It’s a reasonable bet that code looking for a directory to use is probably not looking for a hidden directory, and almost certainly not intending to use a system directory. OK, time for a new implementation.

DWORD attribs;

attribs = GetFileAttributes(fileName);
if ((attribs != INVALID_FILE_ATTRIBUTES) &&         // check if a valid file
    ((attribs & FILE_ATTRIBUTE_DIRECTORY) != 0) &&  // file is a directory
    ((attribs & FILE_ATTRIBUTE_HIDDEN) == 0) &&     // file is not hidden
    ((attribs & FILE_ATTRIBUTE_SYSTEM) == 0))       // file is not system
    // file is a directory that isn't hidden and isn't system

What about files, rather than directories?

It’s natural to want the complementary check: does a filename identify a file rather than a directory? You test for this in exactly the same way, but looking for different attributes. You’ll want to exclude FILE_ATTRIBUTE_DIRECTORY, and then, depending on the job your code is doing, consider excluding files with the following attributes:


and of course, you might also want to consider FILE_ATTRIBUTE_HIDDEN and FILE_ATTRIBUTE_SYSTEM.

Additional reading

Microsoft documentation on GetFileAttributes().

Why is GetFileAttributes the way old-timers test file existence? Old New Thing.


We’ve been quiet for a while, sorry about that.

By , November 28, 2016 3:29 pm


It’s been a while since we posted anything on the blog. If you weren’t a customer regularly receiving our software update emails, you might think we weren’t doing anything.

That’s an oversight on our part. We’re hoping to rectify this over the next few months, posting more useful information both here and in the library.

_tempnam and friends

Our most recent update to C++ Memory Validator provides memory tracking for the _tempnam group of functions: _tempnam, _tempnam_dbg, _wtempnam and _wtempnam_dbg.

This support covers all supported compilers: Visual Studio 2015, 2013, 2012, 2010, 2008, 2005, 2003, 2002 and Visual Studio 6, plus Delphi and C++ Builder, the Metrowerks compiler and the MinGW compiler.

.Net support, for the future

Internal versions of C++ Coverage Validator can provide code coverage statistics for .Net (C#, VB.Net, J#, F#, etc) as well as native languages (C++, C, Delphi, Fortran 95, etc).

Internal versions of C++ Performance Validator can provide performance profiling statistics for .Net (C#, VB.Net, J#, F#, etc) as well as native languages (C++, C, Delphi, Fortran 95, etc).

UX improvements

All tools, free and paid, have had the UX for filename and directory editing improved: if a filename doesn’t exist it is displayed in red, and if it does exist it is displayed in its normal colour (typically black). See the screenshots (from Windows 8.1).

Non existent filename:

Existing filename:


Marmalade game SDK support

By , December 16, 2015 12:02 pm

We’ve recently added support for the Marmalade game SDK to C++ Memory Validator.

This article will show you how to configure a Marmalade project for use with C++ Memory Validator, how to setup C++ Memory Validator for use with Marmalade and how to launch a Marmalade game from C++ Memory Validator.

Configuring your project settings

To work with C++ Memory Validator you need to build the x86 Debug configuration and/or the x86 Release configuration of your Marmalade project using Visual Studio.

These configurations need to be built so that they create debug information and so that a PDB file containing debug information is created. The example projects that ship with Marmalade do not do this – you will need to edit them to make the linker stage create debug information.

Editing the compile stage debug information


Editing the link stage debug information


You must ensure that both the compile and link stages have the correct settings. If only one of the two is set you will not get debugging symbols.

Debugging symbols are important for two reasons:

  • Without symbols C++ Memory Validator cannot find the Marmalade memory allocators and will not be able to track the Marmalade memory allocations your game makes.
  • Without symbols C++ Memory Validator will not be able to turn callstack addresses into class names, function names, filenames and line numbers.

Configuring C++ Memory Validator

In order to work correctly with Marmalade we need to make sure we’re only going to track the memory allocations your game makes via Marmalade, and not any of the work the Marmalade game engine is doing itself. We need to make a few simple changes to C++ Memory Validator.

  • Open the settings dialog. Click Reset. This will reset all C++ Memory Validator settings to the default.
  • Go to the Collect tab, disable the check boxes in the top two groups of check boxes, then enable the single Marmalade check box. The settings should look like shown below.
  • Click OK.


Launching the game

To launch a Marmalade game with C++ Memory Validator we launch the Marmalade simulator and specify the game to run using the Marmalade -via command line argument.

If Marmalade is installed in c:\marmalade then the path to the simulator is


If an example game (shipped with Marmalade) is found at this location


then the command line is


and the startup directory is


We leave the Application to monitor unchanged. It should have the same value as Application to launch.

This is how the launch dialog looks when you are launching this game.


Click Go! to launch the Marmalade game. C++ Memory Validator will monitor all memory allocations, reallocations and deallocations made via the s3eMalloc, s3eRealloc and s3eFree functions. Run your game as normal, then close your game. Wait for C++ Memory Validator to show you any leaks on the Memory tab. Additional statistics can be viewed on the Objects, Sizes, Timeline and Hotspots tabs.


How to speed up Memory Validator by changing DbgHelp version

By , September 6, 2015 9:26 am

Recently a few customers have contacted us to tell us they have experienced a dramatic reduction in speed when using C++ Memory Validator.

We found this puzzling as we hadn’t noticed it ourselves. We investigated and found that some parts of our code were hitting the disk a bit too much. To address this we implemented a buffered file read/write system so that we hit the disk once rather than many times. For our test case (a substantial program being monitored) this worked wonders. Performance improved enormously. Smiles all round.

But our customers still reported problems. This was puzzling. We started logging everything in one particular code path (which we knew was the problem). Nothing obvious. Next step: start timing all the logging. But just before we got to that we did a simple test. We iterated through each version of DbgHelp.dll that C++ Memory Validator can supply – if you remember, we let you specify which Visual Studio version you use and we supply a version of DbgHelp.dll that ships with, and works for, that version (not all DbgHelp.dlls are equal!).

Imagine our surprise when we found that the DbgHelp.dll versions shipped prior to Visual Studio 2013 are blazingly fast, while the DbgHelp.dll we supply for use with Visual Studio 2013/2015 (6.3.9431.0) is slow. If you’re paying attention you’ll also notice that the DbgHelp.dll version number has decreased rather than increased – ask Microsoft, we have no idea why they decreased the version number with more recent releases.

For now, until we can get a new release out to address this anomaly, we recommend that you ignore the Visual Studio version you are actually using and deliberately choose Visual Studio 2012. This will select the faster DbgHelp.dll, and you should find the usual blazing speeds you are used to are restored.

To change the version of DbgHelp used, open the settings dialog, go to Symbol Lookup then change the version in the combo box. Click OK.

Any problems, as usual, please contact support.


Changes to injection behaviour

By , July 23, 2015 4:16 pm

We’ve just changed how we launch executables and attach to them at launch time. We’ve also changed how we inject into running executables. This blog post outlines the changes and the reasoning behind the changes.

The injected DLL

Microsoft documentation for DllMain() states that only certain functions can be called from DllMain() and you need to be careful about what you call from DllMain(). The details and reasons for this have never been made very explicit, but the executive summary is that you risk getting into a deadlock with the DLL loader lock (a lock over which you, as a programmer, have no control).

Even though this dire warning has existed all along, from the first alpha versions of our tools in 1999 until July 2015 we started our profilers by making a call from DllMain() when it received the DLL_PROCESS_ATTACH notification. We never had any major problems with this, but that’s probably because we tried to keep things simple. There is an interesting side benefit of starting your profiler this way – the profiler stub starts as soon as you load the profiler DLL; you don’t need to call a specific function to start it. This is also the downside: you can’t control whether the profiler stub starts profiling the application or not, it always starts.

Launching executables

Up until now the ability for the profiler to auto-start has been a useful attribute. But we needed to change this so that we could control when the profiler stub starts profiling the application into which it is injected. These changes involved the following:

  • Removing the call to start the profiler from DllMain().
  • Adding additional code to the launch profiler CreateProcess() injector to allow a named function to be looked up by GetProcAddress() then called.
  • Changing all calls to the launch process so that the correct named function is identified.
  • Finding all places that load the profiler DLL and modifying them so that they know to call the appropriate named function.

Injecting into running executables

The above changes also meant that we had to change the code that injects DLLs into running executables. We use the well documented technique of calling CreateRemoteThread() to inject our DLLs into the target application. We needed to add the ability to specify a DLL name, a function name, a LoadLibrary() address and a GetProcAddress() address, plus error handling, in the dynamically generated code that our tools can inject into a 32 bit or 64 bit application.

Performance change

A useful side effect of this change from DllMain() auto-start to the start function being called after the DLL has loaded is that thread creation happens differently.

When the profiler stub starts via a call to the start function from DllMain(), any threads created with CreateThread()/_beginthread()/_beginthreadex() wait until the DllMain() call returns before they start running. You can create the threads and get valid thread handles, etc., but they don’t start working until DllMain() returns. This is part of Microsoft’s own don’t-cause-a-DLL-loader-lock-deadlock protection scheme. It meant that the threads that communicate data to the profiler GUI didn’t run until the profiler’s instrumentation process was complete (because it all happened from a call inside DllMain()).

Now that we call the start profiler function after the LoadLibrary() call has returned, the threads start pretty much when we create them. All data comms with the GUI get up to speed immediately and start sending data even as the instrumentation proceeds, so the new arrangement gets data to the GUI faster and the user starts receiving actionable data sooner than with the old arrangement.

UX changes

In doing this work we noticed some inconsistencies in some of our tools (Coverage Validator and Thread Validator, for instance) when injecting from an elevated Validator into a non-elevated running target application. The shared memory used by these tools to communicate important profiling data wasn’t accessible due to permissions conflicts between the two processes. This was a problem because we were insisting that the Validator be elevated prior to performing an injection into any process.


A bit of experimentation showed that under the new injection regime described above we didn’t need to elevate the Validator to succeed at injecting into a non-elevated target application. It seems that you get the best results when the target application and the Validator run at the same elevation level. This is also important for working with services, as they tend to run with elevated permissions these days – but injecting into services is always problematic due to the different security regimes for services.

This insight allowed us to remove the previously mandatory "Restart with administrator privileges" dialog and move any potential request to elevate privileges into the inject wizard and inject dialog. In this article I will describe the inject dialog, the changes to the inject wizard are similar with minor changes to accommodate the difference between dialog and wizard.


Depending upon the operating system and the version of the software, up to two extra columns may be displayed on the inject dialog. The display can be sorted by any column.

Elevation status

When running on Windows Vista or any more recent operating system the inject dialog displays an additional column titled Admin. If a process is running with elevated permissions this column displays an administrator permissions badge, indicating that elevation may be required to inject into this process.

Processor architecture

When running 64 bit versions of our tools an additional column, titled Arch, is added to the inject dialog. This column indicates whether each process is a 32 bit process or a 64 bit process. We could have added a control to display only 32 bit or only 64 bit processes, but our thinking is that this column will only be examined for confirmation when the user of our tools is working on both 32 bit and 64 bit versions of their own software. As such, having to find a process selector and declare an interest in 32 bit processes is overhead the user probably doesn’t need.


Changes to how we build our software

By , January 4, 2015 8:25 pm

Starting with our first software release of 2015, we will start shipping software that uses the Visual Studio 2010 C runtime and MFC libraries. The purpose of this article is to explain why we’re making this change and how this will affect users of our software.

Some of our past choices have been about technology and some about ease of doing the work. If the operating system or development environment fights you, you are less likely to use it.

Compiler History

When we started the company the current version of Visual Studio was Visual Studio 6.0. So naturally our software shipped with C runtime and MFC libraries for Visual Studio 6.0.

We didn’t really like the direction the Visual Studio team took with subsequent releases, in particular some subtle changes to the debugger that, it seems, many people have never noticed. These changes removed a feature we find very useful: drag an address to the disassembly view and automatically get the disassembly displayed (the workaround in current versions of Visual Studio is clumsy, error prone and occasionally fails). Because of this we continued using Visual Studio 6.0 to build our software. Anything requiring support for more recent versions of Visual Studio would be built using that version of Visual Studio (library stubs, Visual Studio editor support, etc.).

When we started exploring 64 bit support we started with Visual Studio 2008 because that was the current toolset at the time. Our 64 bit tools ship with the Visual Studio 2008 C runtime and MFC libraries.

For our .Net tools we used Visual Studio 6.0 for the GUI, but Visual Studio 2010 (or 2012) for the profiler backend.

This resulted in our routinely using multiple versions of Visual Studio to do development.

OS History

Historically we have supported all operating systems from Windows NT 4 onwards. We have always wanted to support our customers using Windows NT 4 and Windows 2000. Many people using Windows embedded are still on old versions of Windows.

We made the rather unusual decision to develop on Windows XP x64 but to test on Windows 7 and Windows 8. This decision was mainly due to UX blunders Microsoft had made with the Start Menu on both Windows 7 (no fly out start menu) and 8 (no start menu!), and also to the file search functionality which is very easy to use on XP x64 and very hard to use from Windows Vista onwards.

These may seem like trivial things but we use file search a lot and often the Visual Studio file search is not the tool you need. Simply put we found XP much, much easier to be productive with compared to anything that came after.

Reason for change

Since April 2014, Microsoft no longer supports Windows XP. From a security standpoint this isn’t good. For us, or for you (if we get compromised, how can you rely on us?). As such we had to move away from Windows XP x64 whether we wanted to or not. We’ve written some tools to allow us to do fast, easy searching without needing to go near the UX disaster that is the current version of Windows search. That removes one of our main objections to working on Windows 7 or Windows 8 on a daily basis.

We also tested a lot of start menu replacements. We finally settled on Classic Shell as this not only provides a very useful start menu on Windows 8 but also allows you to have a proper fly out menu (like XP) rather than the cramped and very awkward to use compressed menu you get with Windows Vista/7 (no wonder Microsoft’s UX metrics showed start menu use was down – they’d made it too hard to use).

So that’s the two UX reasons out of the way, what about the software?

We would also like to take the software tools in new directions – to support .Net alongside C++ rather than in separate tools. This can be done with our current arrangement but it’s harder than necessary. In particular it makes automating our builds harder. Visual Studio 2010 is a lot easier to control from the command line than Visual Studio 6.0, so moving to one Visual Studio for this is a major bonus. Also, debugging software built using multiple different versions of Visual Studio, some of which can’t read debug info from other versions of Visual Studio, is horrible.

We noticed that no one has discussed Windows NT 4 or Windows 2000 with us for quite some time so we decided to drop support for Windows NT 4 and Windows 2000. Removing this constraint meant we could move to the Visual Studio 2010 runtime and thus use one main version of Visual Studio to do most development work.

The reason the Visual Studio 2010 runtime is important is that it is the most recent runtime that supports Windows XP, and many people are still using Windows XP.


An unfortunate consequence of this change to Visual Studio 2010 runtimes is that if you are using more than one of our 32 bit tools or more than one of our 64 bit tools then you will need to install new versions for all of the 32 bit tools (or all of the 64 bit tools). The reason for this is due to DLL dependencies and memory allocated by DLLs all needing to come from the same allocator. Mixing VS 2010 built DLLs with VS 2008 built DLLs or VS 6 built DLLs will give unspecified results.

We realise this is inconvenient and certainly not ideal. We won’t be changing the runtime we use again until none of our customers need support for Windows XP. We anticipate people using XP for many years into the future (based on comments made to us by customers).

The Future

We aim to roll all the functionality of our .Net tools into our C++ tools. C++ Coverage Validator will be able to collect coverage data for .Net code, C++ Memory Validator will be able to collect and analyse .Net memory data, C++ Performance Validator will be able to profile .Net code. The .Net specific tools will then be phased out.

If you are using our .Net tools, you will still be supported – we’ll move your .Net licences to the C++ licences where you will get the .Net data previously provided by the .Net tools.

If you are using our C++ tools, you will get additional .Net data in the tool.

The intention is that the tools will simply get better. We aren’t going to damage the C++ side of the tools to provide the .Net support. I mention this because we’ve had some questions from customers asking about this (having seen what some of our competitors did when they decided to support .Net).

Visual Studio

We will continue to support every version of Visual Studio from Visual Studio 6 through Visual Studio 2013 (and what comes after it, Visual Studio 2014 CTP, now renamed Visual Studio 2015 CTP) and future versions.

Windows NT 4, Windows 2000 support

If you need support for Windows NT 4 or Windows 2000, please contact us. We will be able to support these operating systems, but only via a special build facility.


Visual Studio 2014 CTP

By , August 12, 2014 3:54 pm

A few months ago Microsoft released the Community Technology Preview of Visual Studio 2014, also known as Visual Studio 2014 CTP.


Because this is a community technology preview it only runs on Windows 8, and Microsoft recommend that you do not install it on any machine that is important to you. This caused me some frustration. I had any number of Windows 7 machines I could have put it on, but the Windows 8 machines we have are being used for important tasks. That meant creating a VM. I spent probably the best part of 2 weeks futzing around before I finally got a Windows 8 VM that would succeed in installing Visual Studio 2014 CTP. I tried creating virtual machines from other virtual machines, and creating virtual machines from existing Windows 8 machines. All failed.

The only thing that succeeded was a brand new Windows 8 install from DVD, followed by installing Visual Studio 2014 CTP. And even then, for the install to succeed it had to be a web install. Downloading the .iso, burning it and installing from that always failed (just like for 2012 and 2013).

As with 2012 and 2013 before it, the install process is horrible and uninformative. Gone are the days of an install dialog that tells you what it’s doing and has a useful progress bar. Now we have these “I’m doing something” bars, which tell you nothing about how far through you are and only serve to tell you that the thread running the animation hasn’t crashed. I hate these things. I really do. It’s an awful user experience.

The install dialog does tell you what it’s doing at each stage – but nothing while it’s doing it, so you have no idea whether it’s progressing or has hung. I’ve had so many failures installing 2012, 2013 and 2014 that I don’t trust the installer. And trying to uninstall any of them always fails. I know 2014 CTP is a preview, but 2012 and 2013 are released software.

User experience

This is the next version of Visual Studio, following on from 2012 and 2013 and continuing with the same theme: that very toned down, hard to use, monotone look, although you can choose the “blue” theme. Why they do this I have no idea. Apple excel at UX, and at one time it appeared Microsoft did too, but now the aim seems to be to make everything as hard to use as possible. Colours on icons add information, so don’t take that away! (*) They do seem to have got the message on mixed case text though: ALL CAPS is gone.

(*) I gave up on my Windows Phone because of the icons. It was a great phone except for that crucial bit of UX – I was continually guessing at what the icons meant and could never be sure.

Version Number

The version number for Visual Studio 2014 CTP is 14.0. This is a change to the natural sequence which would have been 13.0.

This means the C Runtime and MFC DLLs end with 140.dll, 140u.dll, 140ud.dll, etc.

What’s new? C Runtime DLL

If you’re a C/C++ developer the big news is that msvcrNNN.dll is gone. The old naming convention for the Microsoft C runtime has been done away with.

The new C runtime DLL is appcrt140.dll (and appcrt140d.dll for debug builds).

Other DLLs that ship with it are desktopcrt140.dll, msvcp140.dll, vccorlib140.dll and vcruntime140.dll.

Full list of candidate redist DLLs:

  • appcrt140.dll
  • desktopcrt140.dll
  • msvcp140.dll
  • vccorlib140.dll
  • vcruntime140.dll
  • vcamp140.dll
  • mfc140u.dll
  • mfcm140u.dll
  • mfc140chs.dll
  • mfc140cht.dll
  • mfc140deu.dll
  • mfc140esn.dll
  • mfc140fra.dll
  • mfc140ita.dll
  • mfc140jpn.dll
  • mfc140kor.dll
  • mfc140rus.dll
  • vcomp140.dll

What’s new? DTE

If part of your work involves getting the Visual Studio IDE to do anything you want, such as opening source files for editing, then you’ll be working with the Visual Studio DTE. The new DTE version skips a number, jumping from 12 to 14, so you’ll want the 14.0 DTE.


Debug memory guard structure, x86

The debug implementation of the C runtime heap uses a linked list of blocks with helpful debug information and two guard regions, one either side of the allocated data. The helpful debug information sits in the debug header along with the linked list pointers. With the introduction of 64 bit support this header was modified to swap two data members (size and blockUse) around to improve memory usage on 64 bit systems (alignment on 8 byte boundaries). This was handled via conditional compilation so that the data members remained in the original order on 32 bit systems.

That conditional compilation element has gone! Code that inspects the debug heap on 32 bit systems now needs to know whether it is working with a heap layout that pre-dates Visual Studio 2014 CTP or one from Visual Studio 2014 CTP. Failure to account for this layout change will likely result in code that inspects the heap reporting incorrect block sizes and/or reporting corrupt data blocks when the data blocks are not corrupt.

This is a serious change. It’s also an obvious step to take. Visual Studio 2014 CTP cannot read debug symbols created with Visual Studio 6, and this layout change also puts paid to debug heap support from that era. Along with dropping (or trying to drop!) support for Windows XP, this is another sign that although many people are still using older operating systems (*) (and compilers), Microsoft really does want to drop the older ways of doing things.

(*) Every now and then we field support questions asking if we still support Windows 2000 (yes), Windows XP Embedded (yes) or Windows CE (no).

Low level detail

The compiler continues to create ever more optimized code. As with some of the Windows 7 service pack releases, we’ve noticed some optimized code sequences that do things differently to before. Visual Studio 2014 CTP doesn’t ship with source or symbols for APPCRT140.DLL (although you can get the latter from Microsoft’s symbol servers), so it’s hard to tell what’s going on inside the new C Runtime. But it’s clearly a new architecture and way of doing things. Many functions that would once have been called by linking to them, with the call redirected through the Import Address Table, are now passed to a lookup helper function that does some sort of encryption, calls GetProcAddress, does more encryption, and then passes the function back to be called. Why do I mention encryption? Because the function names hint at it. It’s quite a change from how things used to be done. Why it’s being done I can’t say: we don’t have the source to examine and I haven’t tried to reverse engineer the functions. These are just things I noticed while investigating some unexpected behaviour as we were adding support for Visual Studio 2014 CTP to our C++ tools.

Updated – A day later!

Just after we published this article James McNellis from the Microsoft Visual C++ Team Blog contacted us to let us know a few things.

  • Apparently source code for the C Runtime is available, but the reason you don’t see it in the debugger is that the symbol files on Microsoft’s symbol servers have been stripped – they contain function names but no filenames and no line numbers.
  • A solution to this is to build your C/C++ application statically linked to the C Runtime. This gives you symbols with which to examine the C Runtime. We didn’t notice this because the only occasions we ran into problems with our ports were when working with code dynamically linked to the C Runtime.
  • Two articles detailing why the changes have been made have been posted to the Visual C++ Team Blog. These articles are worth a read and show that their thinking looks forward many years, mainly with an eye on improving things for developers and on security. These articles are:

When you understand the refactoring and the desire for wider platform support, you can understand the reason for looking up functions with GetProcAddress() and calling the result rather than linking to them. Thanks to James for reaching out and letting us know their thinking.

From the above articles the standout thing for me is that most people will (in the short term at least) want to compile with _CRT_STDIO_LEGACY_WIDE_SPECIFIERS defined, so that <tchar.h> continues to provide the string function behaviour your code expects rather than the C99 standard behaviour implemented in Visual Studio 2014.
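As a build-configuration fragment (not a runnable program), this is roughly what that looks like; the macro is the one named in the blog posts and it must be defined before any CRT header is pulled in, typically in the project’s preprocessor settings:

```cpp
// Keep the legacy Microsoft meaning of %s in the wide printf/scanf
// family: with the macro defined, %s passed to wprintf/_tprintf in a
// UNICODE build means a wide string, as it always did. Under the C99
// rules now implemented, %s in the wide functions means a narrow string
// and %ls must be used for wide strings.
//
// Define it before ANY CRT header is included, or on the compiler
// command line:  cl /D_CRT_STDIO_LEGACY_WIDE_SPECIFIERS ...
#define _CRT_STDIO_LEGACY_WIDE_SPECIFIERS
#include <tchar.h>
#include <stdio.h>

void logName(const _TCHAR *name)
{
    _tprintf(_T("name: %s\n"), name); // behaves as legacy code expects
}
```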


An End to Unwanted UAC prompts

By , May 25, 2014 7:29 am

We have just updated all of our software tools so that they no longer require administrator privileges to run. No more unwanted User Account Control prompts!

I’m going to explain why we used to require administrator privileges, why we’ve removed that requirement and what impact that has on you, the user of the software.

The past: Our software tools up until 24 May 2014


If you’ve ever used our tools on Windows Vista, Windows 7 or Windows 8 you’ll know that they required Administrator privileges to run.

Because of this each time you start the software you are faced with a User Account Control dialog. There is a pause, the screen darkens and then the User Account Control dialog is displayed. You can’t do anything other than interact with this dialog. It’s a “Yes I want to run this. No I don’t want to run this.” deal. But of course you want to run this software, you just started it. Hal really should open the pod bay doors.

This gets even more frustrating if you’ve used our command line interface to automate your testing or if you need the software under test to run without administrator privileges.


The ideal scenario would be for the software to run without requiring administrator privileges, just like most applications on your computer. This would improve the usability of the software, make automated testing smoother and just be better all around.

Reasons for change

When our tools require administrator rights to run, there are multiple consequences:


  1. A User Account Control dialog is displayed, interrupting the user’s flow.
  2. Automated tests require someone present to approve each test’s run because a User Account Control dialog is displayed. This either partially or completely defeats the purpose of automating the tests.
  3. The software under test is now run with Administrator privileges. For most applications that isn’t an issue but for some applications this is not the correct privilege level for that application to run at.

The first issue isn’t ideal and adds frustration to the user’s life.

The second issue is horrible.

The third issue is a deal breaker for the few people that must test their applications at a specific privilege level.

As you can see we had to change this state of affairs.

Working without Administrator Privileges

When running the software without administrator privileges, the only difference you’ll notice is the absence of those privileges. You can launch and monitor applications without administrator privileges. You can also monitor services that are linked to the NT Service API associated with the tool. Only the Inject into Application and Wait For Application functionality (C++ tools only) prompts you for administrator privileges. Other than that, the software works the same as usual.


If you run the software without administrator privileges, the software will communicate with SVLAdminServiceX86 or SVLAdminServiceX64 as appropriate, and the service will do the work that requires elevation.

The C++ tools now work with desktop applications, services running on the LocalSystem account and services running on the LocalService account.


You can still right click the software and run it with administrator privileges. If you choose this, the software will behave exactly as it did in the previous release: it will do all the actions itself and not ask the SVL Admin Service to do the work.

If you want to Inject into a running process or Wait for an application to start, the C++ tools can still do that, but you will need to run with administrator privileges. This is principally because CreateRemoteThread from a service doesn’t work when the target application is not running in the same Windows session – a Windows security improvement. Until we can provide a workaround, these two activities require administrator privileges. The software will automatically prompt you for this process elevation.

Similarly, if you launch an application using CreateProcess (the recommended method) you don’t need administrator privileges, but if you choose any of the other options you will be prompted for them.

Why did we require administrator privileges?

Our software tools are dynamic analysis tools that analyse the behaviour of software at runtime. As such, they use a variety of techniques to invade the user space of the software under test and to communicate the collected data back to the user interface. If the software is a desktop application then different security environments are in force than if the software is a service. When dealing with a service such as Internet Information Server you are dealing with a very locked down process. Some of these processes are very hard to interact with: you may be able to inject into the process but then be unable to communicate with the user interface.

To cope with this, early versions of our software simply used global shared memory (prefixing the shared memory name with Global\) and ran with the appropriate debug security privileges. This worked really well. Then came Vista with its new security regime. I’m sorry to say that our initial response was lazy: we simply required admin privileges for everything as a temporary workaround until we could get around to making things work without them.

So what happened? We got side tracked. We spent a lot of time focussing on other things: supporting different operating systems and different Visual Studio versions, porting the software to 64 bit, floating licences, improving the UX of tools like C++ Coverage Validator. And because we were at the time developing on Windows XP x64 we didn’t feel the pain of the new security regime all the time. We’d use Windows 7 for our email machines, personal laptops and for testing, but not for daily development.

Why were we using Windows XP x64? The answer is simple. The compressed start menu on Windows 7 is harder to use than the flyout menu on Windows XP. The search functionality on Windows XP is so much easier to use than that built into Windows 7, especially if you are searching for words inside a document (think Visual Studio project, manifest, source code, etc). Yes I know Windows 7 search has great power built into it, but it’s slow, hard to use as it’s crammed into one tiny window and in my experience often doesn’t work. Whereas the Windows XP search experience is simple, easy, reliable and fast (three fields, file name, content to search for, where to search). Sadly Microsoft has made search on Windows 8 even worse (right click on a search result and if you open the location you lose the search results – how is that useful?). And of course the start menu has gone.

Anyway, as the end date for Windows XP support, April 8th, loomed, we started getting queries about our software that we’d never fielded before. Most were related to the User Account Control issues already described. People who had been using Windows XP and Windows XP x64 for development were abandoning ship, moving to Windows 7 or Windows 8. By the start of March I knew we had to drop everything we were doing, annoy some of our existing customers by putting certain work on hold, and make our software run without administrator privileges. No User Account Control dialogs!

Usually we issue one or two software releases per week, or maybe one every two weeks. Our last software release was March 19th, over 10 weeks ago. This has been the longest break between software releases we’ve ever had. The scope of the work has been huge.

We’ve had to write a service that the software talks to in order to do anything that would normally require administrator privileges. Then we had to identify every area that was impacted by the change in privilege levels. This meant some areas of the software had to be completely re-written. For example, our Registry handling software now also has to be able to write to files and memory so that we can pass that data to the service, allowing the service to populate certain parts of the Registry on behalf of the software tool. Another area that has changed is shared memory: you can no longer just prefix everything with Global\ and expect it to work, so shared memory handling has had to change. This is particularly important if you are working with services. Copying files to certain locations is no longer possible. Writing files to certain locations is no longer possible. All these things we’ve had to address.

We’ve had to update the installers to correctly install the service (and uninstall it first if we’re replacing it with an updated version). We’ve had to test on Windows XP and Windows XP x64 (to ensure our XP customers still get the experience they expect) and on Windows 7, Windows 7 x64 and Windows 8 to ensure everything works correctly. The different security handling for services between XP and Windows 7/8 caused some interesting problems to be worked around. We’ve tested everything to bits.

And along the way we’ve re-factored some of our software and also exposed and fixed some low level bugs in our hooking software (some of which are a side effect of recent optimisations in the way Microsoft build their operating system DLLs). Error reporting on the Diagnostic tab has improved.

Weren’t we dog-fooding?

You might think we weren’t dog-fooding – that is, using our own software to test our software. We do use our own tools.

But we were not using our tools in an environment that would make these issues apparent to us. We had chosen to eat at our favourite restaurant rather than at the best restaurant in town. (Having just spent two weeks on a cruise ship, I can vouch for the analogy: we quickly concluded that we preferred the bistro to the fancy restaurant.)

In fact we had deliberately chosen Windows XP x64 because we found it easier to use, for the reasons already stated. In hindsight, although this was a good choice for ease of development, it was a poor choice from the point of view of experiencing what our customers at the bleeding edge use.

Our new development environment

Our current development environment is Windows 7, but we’ll be moving to Windows 8 as soon as the new machines arrive. For Windows 8 we’ll be adding the Stardock Start8 start menu. We’ve also written some search tools that, although crude, allow us to search files more easily than with Windows 8’s built in search functionality. We’re writing more tools to make our development work easier, even as Microsoft’s own UX efforts make what we want to do harder.


The move away from requiring administrator privileges has been a time consuming, challenging experience. We’ve learned a lot along the way and have had to change how our software works. The result, however, is that you, our customers, the users of our software, now have an easier time using it. I hope you agree the effort was worth it.

