Backing up your web site in a few steps

by Ventsy Popov
Yesterday I decided to deal with something I’ve been postponing since the moment I installed the blog here. I assume every hosting company has a way to back up the data it hosts, mine included. Yet I am a bit more cautious when it comes to archiving my information in case it has to be recovered later… Along with prayers to the computer gods not to allow such emergencies, I usually want to be prepared as well. Here is a pretty easy way to regularly back up a web site to a local machine:


First go and get BestSync 2011 (link here)
It’s a freeware app and once installed it will let you define a sync procedure between folders (local or not). First you have to set the folders you need to sync, and the sync direction:
Then you have to define the desired schedule for the sync and even mark the task to be run as a Windows Service, so there won’t be any need to run BestSync at Windows startup:
Well, so far so good, but I wanted my backup to be more than a sync process between two folders. My goal was to have daily backup archives. Following this idea, I added an .exe file as an application to be run every time the sync task finishes:
The cool thing here is how I created the .exe file, which had to do the simplest thing – get all the data from the latest backup folder and copy it to a brand new folder named with the current date and time. I wanted a super simple way to do it – a script, for example. So I prepared the following .vbs:
Dim fso, objFolder
strComputer = "."

' Get the local date and time via WMI

Set objWMIService = GetObject("winmgmts:\\" & strComputer & "\root\cimv2")
Set colItems = objWMIService.ExecQuery("Select * from Win32_OperatingSystem")

For Each objItem in colItems
    dtmLocalTime = objItem.LocalDateTime
    dtmYear = Left(dtmLocalTime, 4)
    dtmMonth = Mid(dtmLocalTime, 5, 2)
    dtmDay = Mid(dtmLocalTime, 7, 2)
    dtmHour = Mid(dtmLocalTime, 9, 2)
    dtmMinutes = Mid(dtmLocalTime, 11, 2)
    dtmSeconds = Mid(dtmLocalTime, 13, 2)
Next

' Create a new folder named with the current date and time
strDirectory = "D:\Backups\" & dtmYear & dtmMonth & dtmDay & "_" & dtmHour & "." & dtmMinutes
Set fso = CreateObject("Scripting.FileSystemObject")
Set objFolder = fso.CreateFolder(strDirectory)

' Copy the latest backup into the freshly created folder
fso.CopyFolder "D:\Backups\LatestBak", strDirectory & "\", True
I tested it to be sure it does the job correctly. And here was the first obstacle – BestSync only allowed a .bat, .cmd, or .exe file to be executed after a sync task. A batch or cmd file was not suitable, because I wanted everything to run in silent mode. So here is the trick – I used the little-known IExpress application that can be found in C:\Windows\System32 to convert my .vbs to an .exe. Here is a more in-depth description of how it works:
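For reference, IExpress is driven by a SED file (a self-extraction directive). The wizard generates the full file for you, but a trimmed sketch of what it produces looks roughly like this – the paths and file names below are hypothetical, so treat this as an illustration rather than a file to copy verbatim:

```ini
[Version]
Class=IEXPRESS
SEDVersion=3
[Options]
PackagePurpose=InstallApp
ShowInstallProgramWindow=0
HideExtractAnimation=1
UseLongFileName=1
SourceFiles=SourceFiles
TargetName=%TargetName%
FriendlyName=%FriendlyName%
AppLaunched=%AppLaunched%
PostInstallCmd=<None>
[Strings]
TargetName=C:\Scripts\Backup.exe
FriendlyName=Backup
AppLaunched=wscript.exe backup.vbs
FILE0="backup.vbs"
[SourceFiles]
SourceFiles0=C:\Scripts\
[SourceFiles0]
%FILE0%=
```

The key entries are TargetName (the .exe to produce), AppLaunched (the command run when the package executes), and ShowInstallProgramWindow=0, which keeps everything silent.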

…well, this is no more than a 20-minute procedure to assure you’ve taken some disaster measures :). If you try it out, do not forget to test a restore scenario… backups usually work pretty well, but when it comes to restoring… right?



SQL Server Integration Services - example of error handling

by Ventsy Popov

There is a saying about a student who constantly complained to his teacher of always having so much work to do that time was never enough for him. “If you never stop working, when do you actually think about the things you should do and how you should do them?” was the question the teacher asked him. Following this thought, when my daily task flow gets heavily loaded, I usually try to stop once in a while and think of the bigger picture – what I am actually trying to achieve and whether I am doing it effectively.

For a couple of days now I have been trying to sit and play a bit with Microsoft SQL Server Integration Services (SSIS) error handling. The reason was that colleagues and I were working on an SSIS project, and at a certain point the error analysis of migrated data became something of a pain in the triple-letter word :).

We identified the following issues:

  1. We spent more time analyzing errors than actually transferring the data.
  2. While doing the above, we tended to make mistakes in the statistics we provided to the client.

Then we tried to nail the actual reasons for these things:

  1. We had too many error destination components, and we had to go over each one every time the migration finished. This was slowing us down.
  2. We did not have a common format for the error content in the destination components, so each one required a different analysis approach, which is an error-prone way of doing things.

Having this figured out, the next step was to set the goals:

  1. We needed as few error destinations to log data into as possible. One error output per data flow task was an excellent optimization for our case.
  2. We wanted a data format that is relatively easy to analyze. So we targeted at least an Excel sheet to output into, not just a flat file, for instance.

So I put some effort into making an example of a problematic (in the context of our case) package. In it we had two types of error outputs (all pointing to flat file destinations) – a) business logic errors and b) data manipulation errors:


Starting almost from scratch:


The data flow task was transformed into this:


The key points here are:

  1. We have multiple error outputs (both - business logic and data manipulation) united into a single path flow.
  2. For every business logic type of error we add a custom reason (i.e. “Add Reason” component on the image).
  3. We have a Script component extracting the actual description of errors with the following code:
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        // Rows redirected from an error output carry an SSIS error code
        if (Row.ErrorCode_IsNull == false)
        {
            Row.ErrorDescription = ComponentMetaData.GetErrorDescription(Row.ErrorCode);
            try
            {
                Row.ErrorColumnName = ComponentMetaData.InputCollection[0].InputColumnCollection[Row.ErrorColumn].Name;
            }
            catch (Exception ex)
            {
                Row.ErrorColumnName = "Column Name retrieval failure";
            }
        }
        else
        {
            // Business logic errors carry the custom reason we attached upstream
            Row.ErrorDescription = Row.Reason;
        }
    }
  4. All information is gathered into an Excel sheet for later analysis.

If someone wants to dig deeper, here is the source code of the package: (28.83 kb)


Elieff Center for Education and Culture - PHP on IIS

by Ventsy Popov
A while ago I was called on the phone by my boss, Vladimir Tchalkov, with the offer to cover for him on a talk he had to give. I say an "offer", since I have to be honest and admit that it was more of an opportunity for me than a favour to Vlado, who had an urgent trip to make at that time... In other words – something of a win-win situation for both of us. The topic was "PHP Applications Hosted on IIS" and although I had literally no time to prepare, most of the work was already done for me. With some counseling from Vlado and some fooling around with a test application, I was ready to launch :).

The talk was actually part of a small Microsoft event about PHP integration with Microsoft technologies. The other presentation on this topic was made by Svetlin Nakov. Since the audience consisted more of PHP guys than .NET ones, we kinda had to convince them there is a real deal in using PHP along with Microsoft products... Hopefully we at least inclined them toward giving it a shot :).

Here are some broad strokes of the raw material from my side of the talk:

Although CGI is a relatively easy way to delegate the generation of a web page to an executable, it comes at a certain cost. Every time a command is called, we pay the price of creating a new process, which can be a bit of a performance drawback. ISAPI extensions, on the other hand, can be really fast (assuming they were developed properly), but thread safety has to be taken into account separately. On top of that, we cannot use scripting languages to create ISAPI extensions (or filters).


Here comes FastCGI
Which, we can say, combines the good sides of both of the above:
 - A process is created on the first request and then reused. Hence it is very fast.
 - It has single-threaded execution, which is recommended for non-thread-safe PHP applications. This way we can count on stability as well.
How to Install
1) You can use the Web Platform Installer and, with just a few clicks, have your environment ready, or
2) You can take the more tedious route of doing it by hand: enabling FastCGI on IIS, downloading the latest version of PHP for Windows, and configuring IIS to handle PHP requests.
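If you take the manual route, the IIS handler mapping that routes .php requests through FastCGI ends up looking roughly like this in web.config (the php-cgi.exe path below is an assumption – use wherever you installed PHP):

```xml
<system.webServer>
  <handlers>
    <!-- Route every *.php request through the FastCGI module to php-cgi.exe -->
    <add name="PHP_via_FastCGI"
         path="*.php"
         verb="*"
         modules="FastCgiModule"
         scriptProcessor="C:\PHP\php-cgi.exe"
         resourceType="Either" />
  </handlers>
</system.webServer>
```

This is the same mapping the Web Platform Installer creates for you in option 1.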

Good To Have in Mind  
After installing, you might want to check out the web.config and configure the maximum number of requests served before a process is recycled, and adjust the maximum number of process instances run on a single processor:

<fastCgi>
  <application fullPath="C:\PHP\php-cgi.exe"
               instanceMaxRequests="10000"
               maxInstances="4" />
</fastCgi>
Here is the full presentation for PHP Applications on IIS.


Sofia University - Design Patterns course (Facade)

by Ventsy Popov

I intend to start something of a personal initiative and upload materials from talks I have given. Hopefully they can be useful to others. One of the courses I eagerly participated in, a couple of years ago, was on the subject of Design Patterns.

Façade was the first topic I had to cover, and the talk was very well received by the students at Sofia University, even though I had very little experience in public speaking back then. Thinking about it – good preparation plus some funny moments played the main role in that. As for the more technical aspect, I will try to lead you through the main idea of the pattern:
From time to time we may come across a complicated system (in the sense of being highly coupled) that we somehow have to plug into to use its features. God forbid the system was not developed by us – then the misery is doubled :). Let me illustrate what I mean:


If we are in the tough position of creating the client, we will bump into:
   - the complexity of such a base system;
   - the high coupling of its classes;
   - the difficulties, which we can hardly foresee, of supporting such a monster :).
So what we can do is expose only the functionality we need for our client and, in a way, forget about the rest of the hairy base system, reshaping the sketch like this:


In such a way we create a separate level of abstraction, which is much more easily “absorbable” by us developers.
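To make the idea concrete, here is a minimal sketch in JavaScript (the subsystem and class names are made up purely for illustration): the facade wires up a few subsystem calls behind one simple method, so the client never touches the subsystem classes directly.

```javascript
// Hypothetical subsystem classes -- in a real base system these
// would be the complex, tightly coupled parts we want to hide.
class InventoryService {
  reserve(item) { return "reserved " + item; }
}
class PaymentService {
  charge(amount) { return "charged " + amount; }
}
class ShippingService {
  ship(item) { return "shipped " + item; }
}

// The facade: one simple entry point that hides the subsystem wiring.
class OrderFacade {
  constructor() {
    this.inventory = new InventoryService();
    this.payment = new PaymentService();
    this.shipping = new ShippingService();
  }
  // The only method a client ever needs to call.
  placeOrder(item, amount) {
    return [
      this.inventory.reserve(item),
      this.payment.charge(amount),
      this.shipping.ship(item)
    ].join(", ");
  }
}

// Client code depends on the facade alone.
const facade = new OrderFacade();
console.log(facade.placeOrder("book", 10));
// -> "reserved book, charged 10, shipped book"
```

The client sees one method instead of three classes – that thin layer is the whole trick of the pattern.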

Here are the presentation FacadePattern.pdf and the demos. Have in mind the source code is a very minimalistic example of a Façade pattern implementation – enjoy :).


Sofia .NET User Group - Open Data Protocol (OData)

by Ventsy Popov
Recently I gave a talk for the Sofia .NET User Group. It was actually my first talk for the user group, and I was happy to see that after a couple of years of irregular gatherings, folks are still interested in these events. Nadya Atanasova, who has lately been the main engine behind organizing us geeks, is doing a great job promoting the meetings.

The topic of my talk was the Open Data Protocol and WCF Data Services. With a few surprises – for example, video recording with Microsoft Live Meeting (which I happened to have no experience with) – we were able to start and cut to the chase :). First we travelled a bit in time, tracking the history of OData through Astoria and ADO.NET Data Services until it became a separate web protocol. After that we pulled some data out of Netflix (a super-exploited jQuery and OData example, by the way :)):
       $(document).ready(function () {
           // $callback=? makes the OData service wrap the JSON response as JSONP
           $.getJSON(
               "http://odata.netflix.com/Catalog/Titles?$top=10&$format=json&$callback=?",
               function (data) {
                   $.each(data.d.results, function (key, val) {
                       var str = '<h3>' + val.Title + '</h3>' + val.Abstract + '<br/>';
                       $('#titles').append(str); // "#titles" is just some container element on the page
                   });
               });
       });



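Worth noting: calls like the jQuery one above are plain HTTP GETs – OData query options compose directly in the URI. A few illustrative examples against the same (now historical) Netflix catalog service, with property names from memory, so adjust them to the service metadata:

```
http://odata.netflix.com/Catalog/Titles?$top=5
http://odata.netflix.com/Catalog/Titles?$filter=ReleaseYear gt 2005&$orderby=Name
http://odata.netflix.com/Catalog/Titles?$select=Name&$skip=10&$top=10
```

$top, $skip, $filter, $orderby and $select are the standard OData query options, which is what makes any OData feed explorable straight from the browser.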
And of course, in honor of Microsoft, we played with a little C# code, implementing a simple WCF Data Service with exception handling:
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("*", EntitySetRights.All);
        // config.SetServiceOperationAccessRule("MyServiceOperation", ServiceOperationRights.All);
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
    }

    protected override void HandleException(HandleExceptionArgs args)
    {
        // Wrap any failure in a DataServiceException so the client gets a proper HTTP status
        args.Exception = new DataServiceException(500, args.Exception.Message);
    }

You can check out the slides and demos: