With Global Windows Azure Bootcamp only a few days away, I just received confirmation of all the things we’re going to be giving away this Saturday, and I must say that the list is long and contains a lot of interesting offerings from our sponsors.

  • MyGet http://www.myget.org/ : Every attendee gets 2 free months; 1 winner at each location wins a 1-year subscription.
  • JetBrains http://www.jetbrains.com/ : 1 winner at each location wins a free personal license to one of JetBrains’ .NET products or IDEs (their choice). See www.jetbrains.com for a list; it is also included in the giveaway slides.
  • Blue Syntax http://www.bluesyntax.net : 3 winners at each location win an Enterprise license to Enzo Cloud Backup.
  • Infragistics http://www.infragistics.com/ : 1 winner at each location wins an Infragistics Ultimate license.
  • Opsgility http://www.opsgility.com/ : 1 winner at each location wins a voucher (up to $200) for an online course.
  • Cerebrata http://www.cerebrata.com/ : 5 winners at each location win a license to Azure Management Studio.
  • CloudBerry http://www.cloudberrylab.com/ : 5 winners at each location win a license of their choice to CloudBerry Drive, Explorer Pro, or Backup.
  • Paraleap https://www.paraleap.com/ : Every attendee gets 1 free month of Azure Watch.
  • CODE Magazine http://www.codemag.com/magazine/ : Every attendee gets a digital copy of the March/April issue of CODE Magazine, plus a 1-year free digital subscription to CODE Magazine.
  • Zudio https://zud.io/ : Every attendee gets 6 months free.
  • Stackify http://www.stackify.com/ : Every attendee gets 1 free license. Install it and get a T-shirt sent to you.
  • Wintellect Now! http://www.wintellectnow.com/ : Every attendee gets a 2-week subscription; those who register before April 25th get a 50% discount.
  • FacetFlow https://facetflow.com/ : Every attendee gets 2 free months when they sign up past the trial.
  • Appveyor http://www.appveyor.com/ : Every attendee gets 2 free months with the purchase of any plan.

And that’s not all… register and find out what the big prize of GWAB Greece is…


Registrations for the Global Windows Azure bootcamp in Athens are now open.

Now is the time to register in order to get up to speed on developing for Windows Azure. The class includes presenters and trainers with deep, real-world experience with Windows Azure, as well as a series of labs so you can practice what you just learned.

Hurry, because seats are filling up fast.


In April of 2013 a bunch of MVPs held the first Global Windows Azure Bootcamp at more than 90 locations around the globe! Unfortunately, Greece was not on the list of participants, mainly due to bad timing.

This year, though, I’ve decided to organize the event for Greece, so we plan to offer a one-day deep-dive class to help thousands of people get up to speed on developing cloud computing applications for Windows Azure. In addition to this great learning opportunity, the hands-on labs will pool a huge global compute farm to perform research for charity!

If you’re not familiar with the event, you can learn more about it at http://global.windowsazurebootcamp.com/

Call for action

So, if you’re interested and you’d like to help organize this event for Greece, I’m currently looking for a co-organizer. And of course, if you’d like to speak and have a topic you wish to present, drop me a line and we’ll talk.

Looking forward to seeing you all there.

It’s been almost a month since this year’s ITPro|DevConnections, and at least a couple of weeks that this post has been sitting in my drafts folder waiting to be published. A lot of things have changed this summer (but that is the topic of another post) and things have been pretty hectic, so I haven’t been able to keep up with my blogging, or to tell you all about this year’s ITPro|DevConnections.

For those of you who are still not aware of what ITPro|DevConnections is: every year around November the IT professionals’ community, www.autoexec.gr, in cooperation with the developers’ community, www.dotnetzone.gr, presents a two-day event with lots of parallel tracks! The event has been running for four years now, each year with more success. During these two days we get to talk about all the cutting-edge Microsoft technologies, meet and discuss with fellow professionals, and generally have a great time. If you speak Greek, you can also watch how the event was covered by one of the largest TV networks in Greece (ANT1).

Organizing these two days takes planning and preparation all year round, as well as finding the necessary funds (nothing is free) to minimize the risk of having to pay out of our own pockets. It goes without saying that it requires a lot of energy and can get really stressful at times, but it is very rewarding in the end.

This year, around 600 IT professionals and developers attended, making it the largest community-organized event in Greece. Although we faced a number of issues during the event, I believe everyone was satisfied in the end with the quality of the content offered. We’re certainly learning from our mistakes, and we’ll make ITPro|DevConnections even better next year.

My session this year was about searching in the cloud: “Boost your searching with Windows Azure”.

Talking with a lot of customers, I’ve discovered that one of the major pain points when developing or migrating their applications to Windows Azure is search. A large majority currently use SQL Server Full-Text Search to search text in their databases. Unfortunately, SQL Server Full-Text Search is not currently supported in Windows Azure, so in my session we explored the available search options and discussed the benefits and drawbacks of each solution.

You can find my slide deck here…

I’m looking forward to next year’s event.

Summernote is a simple, clean and flexible WYSIWYG Editor that is built on top of jQuery and Bootstrap.

It supports both of Bootstrap’s active versions (2 and 3) and has keyboard shortcuts for all major tasks.

Summernote WYSIWYG Editor

There is a powerful API which provides lots of customization options in terms of design (width, height, active items…) and functionality.

The project also has integration samples for major web platforms (PHP, Ruby, Django, Node.js).
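For example, initializing the editor with a custom size and toolbar might look like the sketch below (the option names follow Summernote’s documented API, but the element ID and values are illustrative; jQuery and the Summernote plugin are assumed to be loaded on the page):

```javascript
// Illustrative Summernote configuration: the option names (height,
// focus, toolbar) come from Summernote's API; the values are examples.
var editorOptions = {
    height: 300,   // editor height in pixels
    focus: true,   // focus the editor after initialization
    toolbar: [     // choose which button groups are active
        ['style', ['bold', 'italic', 'underline']],
        ['insert', ['link', 'picture']]
    ]
};

// In the browser you would then attach the editor to an element:
// $('#summernote').summernote(editorOptions);
```

Passing no options at all also works and gives you the default toolbar.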

It was only last week, when I talked about Windows Azure Mobile Services at a www.dotnetzone.gr event at Microsoft, that I argued for the need to separate business logic from the data store, and it seems that ScottGu might have been listening.

Today ScottGu announced a bunch of major, ultra-cool new features for Windows Azure Mobile Services:

  • Custom API support
  • Git source control support
  • Node.js NPM module support
  • A .NET API via NuGet
  • Free 20MB SQL Database option for Mobile Services and Web Sites
  • Android broadcast push notification support

If you want to find out more about how these new features work, visit ScottGu’s blog post, or drop by the TechEd Europe expo next week and I’ll show you.

I’ve got only one more thing to say: if you’re a mobile app / Windows 8 developer, you should definitely have a look…

What an honor being part of this team running the largest Tech event in Greece.

Preparations for this year’s event have started.

Stay tuned…

Microsoft.Net.Http was released today as a stable NuGet package. Yep, that’s right: You can finally start using the portable HttpClient 2.1 in production!

This NuGet package makes a standard set of HttpClient APIs available across a wider array of platforms, including Windows Phone 7.5 and higher, .NET Framework 4.0 and higher, and Windows Store. This enables you to share more .NET code across these platforms.

Read more about it in the announcement to the .NET Blog.

Now that my blogging engine is back on track, it’s time for the long-awaited third part of my Google Reader alternative series, and time to show you how I implemented the server-side code that reads news for the subscriptions I stored in SQL Azure in part 2.

Since Windows Azure Mobile Services allows you to schedule and run recurring jobs in your backend, I thought my Google Reader service could define some server script code that would execute on a schedule I defined and download the RSS entries for each of my subscriptions. So I created a new scheduled job by clicking the Scheduler tab and then the +Create button at the bottom of the page.


In the scheduler dialog, I entered DownloadPosts as the Job Name, I set the schedule interval and units, and then clicked the check button.

Once the job was created I clicked the script tab at the top in order to write my business logic to fetch the RSS feeds.


The code I wrote was pretty simple: for each subscription stored in my Feeds table, I called the URL stored in the FeedUrl field of the Feed entry.

function DownloadPosts() {
    console.log("Running post import.");
    var feedsTable = tables.getTable("feed");
    // Read every subscription and import its posts.
    feedsTable.read({
        success: function(results) { results.forEach(importPosts); }
    });
}
The importPosts function was responsible for making the actual HTTP call and storing the data in the posts table.
function importPosts(feed) {
    var req = require('request');
    console.log("Requesting item '%j'.", feed);
    req({ url: "" },
        function(error, result, body) {
            //1. Parse body
            //2. Store in the database
        });
}

Unfortunately, Windows Azure Mobile Services does not yet provide an XML Node module that could parse the XML-formatted RSS feeds returned by the subscriptions (according to the team, more modules are planned for the near future); only a JSON module is available to consume JSON feeds. So at this point I had two options: either move my post-retrieval logic to the client, or build a proxy to convert the XML RSS feed to a JSON one. I went with the second, for two reasons:

  • If I moved the download logic to the client, it would have to be replicated in every client I used.
  • Parsing the XML feed server side will be possible in the near future, so there is no point in replicating the code; it will be relatively easy to replace the proxy and parse the feed directly in the scheduled job once the XML Node module is released.

So I built a simple cloud service containing a very basic web role that converts RSS feeds to JSON. The web role contains an HTTP handler that receives a Feed ID as a query-string argument. It looks up the ID in the Subscriptions table, finds the feed URL, and calls it to get the XML data. It then uses the Newtonsoft.Json serializer to serialize objects containing the data from the parsed RSS feed. The code looks like this:

using System;
using System.Collections.Generic;
using System.Net;
using System.ServiceModel.Syndication;
using System.Web;
using System.Linq;
using System.Xml.Linq;
using CloudReader.Model;
using System.Diagnostics;

namespace XmlToJsonApi
{
    public class Convert : IHttpHandler
    {
        const string CONTENT_NAMESPACE = "http://purl.org/rss/1.0/modules/content/";
        const string MOBILESERVICE_ENDPOINT = "https://cloudreader.azure-mobile.net/tables/feed/{0}";

        public int FeedID
        {
            get { return GetFromQueryString("id", 0); }
        }

        /// <summary>
        /// You will need to configure this handler in the Web.config file of your
        /// web and register it with IIS before being able to use it. For more information
        /// see the following link: http://go.microsoft.com/?linkid=8101007
        /// </summary>
        #region IHttpHandler Members

        public bool IsReusable
        {
            // Return false in case your Managed Handler cannot be reused for another request.
            // Usually this would be false in case you have some state information preserved per request.
            get { return false; }
        }

        public void ProcessRequest(HttpContext context)
        {
            SyndicationFeed feed = GetFeed(FeedID);

            var datafeed = ParseFeed(feed);

            string data = Newtonsoft.Json.JsonConvert.SerializeObject(datafeed.Select(pe => new
                {
                    Author = pe.Author,
                    CommentsCount = 0,
                    Description = pe.Description,
                    Content = pe.Content,
                    FeedId = pe.FeedId,
                    IsRead = pe.IsRead,
                    Link = pe.Link,
                    PubDate = pe.PubDate,
                    Stared = pe.Stared,
                    Title = pe.Title
                }), Newtonsoft.Json.Formatting.Indented);

            context.Response.ContentType = "application/json";
            context.Response.Write(data);
        }

        private List<Post> ParseFeed(SyndicationFeed feed)
        {
            List<Post> posts = new List<Post>();
            if (feed != null)
            {
                // Use the feed
                foreach (SyndicationItem item in feed.Items)
                {
                    try
                    {
                        Post feedItem = new Post()
                        {
                            Author = string.Join(", ", item.Authors.Select(sp => sp.Name)),
                            CommentsCount = 0,
                            Description = item.Summary != null ? item.Summary.Text : "",
                            Content = GetContent(item),
                            FeedId = FeedID,
                            IsRead = false,
                            Link = item.Links.Select(l => l.Uri.AbsoluteUri).FirstOrDefault(),
                            PubDate = item.PublishDate.UtcDateTime,
                            Stared = false,
                            Title = item.Title.Text
                        };
                        posts.Add(feedItem);
                    }
                    catch (Exception exception)
                    {
                        HandleException(exception);
                    }
                }
            }
            return posts;
        }

        private string GetContent(SyndicationItem item)
        {
            string content = null;
            if (item.Content != null)
            {
                content = item.Content.ToString();
            }
            else if (item.ElementExtensions.Where(e => e.OuterNamespace == CONTENT_NAMESPACE && e.OuterName == "encoded").Count() > 0)
            {
                var elem = item.ElementExtensions.Where(e => e.OuterNamespace == CONTENT_NAMESPACE && e.OuterName == "encoded").FirstOrDefault();
                if (elem != null)
                    content = elem.GetObject<string>();
            }
            return content;
        }

        private SyndicationFeed GetFeed(int feedId)
        {
            try
            {
                using (WebClient client = new WebClient())
                {
                    var buffer = client.DownloadString(string.Format(MOBILESERVICE_ENDPOINT, feedId));
                    Feed feed = Newtonsoft.Json.JsonConvert.DeserializeObject<Feed>(buffer);
                    XDocument doc = XDocument.Load(feed.XmlUrl);
                    return SyndicationFeed.Load(doc.CreateReader());
                }
            }
            catch (Exception ex)
            {
                HandleException(ex);
            }
            return null;
        }

        private void HandleException(Exception ex)
        {
            Trace.TraceError(ex.ToString());
        }

        public int GetFromQueryString(string requestParameter, int defaultValue)
        {
            string value = HttpContext.Current.Request.QueryString[requestParameter];

            if (!String.IsNullOrEmpty(value))
            {
                int iValue = -1;

                if (Int32.TryParse(value, out iValue))
                    return iValue;
            }

            return defaultValue;
        }

        #endregion
    }
}
As you can see, the handler uses my Mobile Service’s REST API to access the feed; there’s no secret key involved and no Mobile Services managed API call. I’m just calling https://cloudreader.azure-mobile.net/tables/feed/{0}, passing the ID, and getting back a JSON-serialized Feed object. For this to work, I had to change the security permissions on the Feed table and allow Read access to everyone.
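The same anonymous REST call can be sketched from the Node side as follows (feedUrl is a hypothetical helper introduced here for illustration; the endpoint comes from the handler above, and the request only succeeds because Read access on the table is open to everyone):

```javascript
// The public REST endpoint of the Mobile Service's feed table.
var MOBILESERVICE_ENDPOINT = "https://cloudreader.azure-mobile.net/tables/feed/";

// feedUrl is a hypothetical helper that builds the per-row URL,
// e.g. https://cloudreader.azure-mobile.net/tables/feed/42
function feedUrl(feedId) {
    return MOBILESERVICE_ENDPOINT + feedId;
}

// No application key header is sent because anonymous Read access
// is enabled on the table. In a server script you would then do:
// var req = require('request');
// req({ url: feedUrl(42), headers: { accept: 'application/json' } },
//     function (error, result, body) {
//         var feed = JSON.parse(body); // the JSON-serialized Feed row
//     });
```

Opening a table to anonymous reads is a trade-off: it keeps the proxy simple, but anyone who knows the URL can list your feeds.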


Once I had deployed and tested my proxy cloud service, I revisited my Windows Azure Mobile Services scheduled job to complete it. The job now calls the proxy for each feed to convert the RSS feed to JSON and then, if a post is not already in the database, inserts it. The final code looks like this:

function importPosts(feed) {
    var req = require('request');
    console.log("Requesting item '%j'.", feed);
    req({
            url: "http://cloudrssreader.cloudapp.net/xmltojson.ashx?id=" + feed.id,
            headers: { accept: 'application/json' }
        },
        function(error, result, body) {
            var json = JSON.parse(body);
            json.forEach(function(post) {
                var postsTable = tables.getTable("post");
                // Only insert posts we haven't stored yet (matched by Link).
                postsTable.where({ Link: post.Link }).read({
                    success: function(existingPosts) {
                        if (existingPosts.length === 0) {
                            console.log("Inserting item '%j'.", post);
                            postsTable.insert(post);
                        }
                    }
                });
            });
        });
}

Once the scheduled job was enabled, new posts were stored in my database every hour, ready to be presented in my clients.

Stay tuned to read, in part 4 of this series, how I consumed these posts in the Windows 8 client I talked about in part 1.

As you may have already read, a couple of months back I migrated my blog to Windows Azure Web Sites and changed my blog engine to WordPress. The WordPress gallery template comes with a free 20MB MySQL database offer from ClearDB, which was more than I needed (or so I thought), so I went ahead and used it.

Lately, though, I’ve been experiencing strange behavior: at unpredictable intervals, my changes stopped being persisted to the database. After getting in contact with the ClearDB guys, I learned that the problem was that the MySQL engine was generating temp tables to facilitate the queries being made to it, and the size of the data in those tables exceeded my quota, leading to a block. The conversation reached the point where the ClearDB guy said:

…I’m sorry for the inconvenience. If you’re not able to optimize your queries, and I gather that you’re using a stock WordPress install, the solution is simply to upgrade to the next service tier… (You can read the full transcript here)

I got so frustrated by this that I decided to investigate whether moving to SQL Azure was an option, and guess what… you can easily use SQL Azure with your WordPress Azure Web Site. Here’s how.


Step 1

Create a new WordPress site locally using WebMatrix, by clicking New and selecting App Gallery from the menu.

Step 2

Download WP Db Abstraction. This plugin provides database access abstraction and SQL dialect abstraction for SQL Server. It is an mu (must-use) plugin and also a db.php drop-in.

Step 3

Put wp-db-abstraction.php and the wp-db-abstraction directory in wp-content/mu-plugins. This should be parallel to your regular plugins directory; if the mu-plugins directory does not exist, you must create it. Copy the db.php file from inside the wp-db-abstraction directory to wp-content/db.php. Rename the wp-config.php file in the root folder to wp-config_old.php, since a new one will be written by the setup wizard at a later step.

Step 4

Publish your newly created WordPress site to an empty Azure WebSite slot.

Step 5

Create a new SQL Azure database

Step 6

Once the site is published, visit http://your_wordpress_url/wp-content/mu-plugins/wp-db-abstraction/setup-config.php to generate your wp-config.php file, and follow the instructions to connect WordPress with the database you created in the previous step. In the database type field you’ll have to select BDO_SQL as the type to connect to SQL Azure. Note that WordPress must be able to connect to the database; otherwise you’ll get an error.

Step 7

Complete the normal WordPress installation

Step 8

There is a known bug in WP Db Abstraction that causes your posts and media files to not appear in their corresponding lists in the admin. You can read more about it here. You’ll have to edit the translate_limit function in the wp-db-abstraction/translations/sqlsrv/translations.php file:

// Check for true offset
if ( (count($limit_matches) == 5  )  && $limit_matches[1] != '0' ) {
    $true_offset = true;
} elseif ( (count($limit_matches) == 5 )  && $limit_matches[1] == '0' ) {
    $limit_matches[1] = $limit_matches[4];
}

The problem is that $limit_matches actually returns 6 items, while the validation checks for exactly 5 — that’s all.

// Check for true offset
if ( (count($limit_matches) == 5  )  && $limit_matches[1] != '0' ) {
    $true_offset = true;
} elseif ( (count($limit_matches) >= 5 )  && $limit_matches[1] == '0' ) {
    $limit_matches[1] = $limit_matches[4];
}

Just change the == to a >= and it works like a charm.

If you already had a WordPress installation like I did, you can easily migrate your data to your new SQL Azure-backed installation: you just export all the data from your old site and import it into the new one.

Now, at half the price ($4.99 per month) of what I would have had to pay ClearDB to upgrade to the next service tier, I’ve got all the space I’m ever going to need (up to 100MB), all of it available for my data, not temp tables.