
Managing and tuning the performance of relational databases is a challenging task that requires significant expertise and time investment, but now we have another tool in our arsenal to help us optimize our relational workloads.

Query Performance Insight allows you to spend less time troubleshooting database performance by providing the following:

  • Deeper insight into your database's resource (DTU) consumption.
  • The top DTU consuming queries, which can potentially be tuned for improved performance.
  • The ability to drill down into the details of a query.

For more information, visit the Query Performance Insight page on MSDN.


The patterns & practices team has been working on developing Azure architecture guidance.

The first round of guidance is now available to the public at https://github.com/mspnp/azure-guidance. The purpose of this project is to provide architectural guidance that enables you to build and deploy world-class systems using Azure. These documents focus on the essential aspects of architecting systems to make optimal use of Azure and summarize best practices for building cloud solutions. The current set of guidance documents contains the following items.

· API Design describes the issues that you should consider when designing a web API.

· API Implementation focuses on best practices for implementing a web API and publishing it to make it available to client applications.

· Autoscaling Guidance summarizes considerations for taking advantage of the elasticity of cloud-hosted environments.

· Background Jobs Guidance describes the options available and best practices for implementing tasks that should be performed in the background.

· Content Delivery Network (CDN) Guidance provides general guidance and good practice for using the CDN to minimize the load on your applications, and maximize availability and performance.

· Caching Guidance summarizes how to use caching with Azure applications and services to improve the performance and scalability of a system.

· Data Partitioning Guidance describes strategies that you can use to partition data to improve scalability, reduce contention, and optimize performance.

· Monitoring and Diagnostics Guidance provides guidance on how to track the way in which users utilize your system, trace resource utilization, and generally monitor the health and performance of your system.

· Retry General Guidance covers general guidance for transient fault handling in an Azure application (a minimal retry sketch follows this list).

· Retry Service Specific Guidance summarizes the retry mechanism features for the majority of Azure services, and includes information to help you use, adapt, or extend the retry mechanism for that service.

· Scalability Checklist summarizes best practices for designing and implementing scalable services and handling data management.

· Availability Checklist lists best practices for ensuring availability in an Azure application.
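To give a flavor of what the retry guidance covers, here is a minimal transient-fault retry sketch in C#. The attempt count, the linearly growing delay, and the choice of WebException as the transient signal are illustrative assumptions on my part, not values taken from the guidance documents.

using System;
using System.Net;
using System.Threading.Tasks;

static class RetryHelper
{
    // Retries an operation a few times, waiting a little longer after each
    // failed attempt. Attempt count, delay, and exception filter are all
    // illustrative; the guidance discusses how to choose them per service.
    public static async Task<T> ExecuteAsync<T>(Func<Task<T>> operation, int maxAttempts = 3)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                return await operation();
            }
            catch (WebException) when (attempt < maxAttempts)
            {
                // Treat the failure as transient and back off before retrying.
                await Task.Delay(TimeSpan.FromSeconds(attempt));
            }
        }
    }
}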

The authors state that this is a living project, and they welcome feedback, suggestions, and other contributions to these items. So if you have any questions or suggestions, you can join the Gitter chat.


In April 2013 the first Global Windows Azure Bootcamp was held at more than 90 locations around the globe! In March 2014 Athens joined the event: we scratched the surface of Cloud Computing, ate some pizzas, had a ton of fun at the labs, and gave out some pretty nifty swag.


See what happened last year at Athens GWAB

This year we are doing it again: a one-day deep-dive class where participants can get up to speed on developing Cloud Computing applications for Azure.

In addition to this great learning opportunity, we will have another set of hands-on labs, a lot more speakers and discussion topics, and a lot more surprises (if you catch my drift 🙂 )…

Last but not least, this year everyone can participate in a massive compute pool to perform breast cancer research!

If you haven’t registered yet, visit http://athens.azurebootcamp.net and register now to make sure we reserve a seat for you!


Last week I had the opportunity to discuss Azure Mobile Services at a Code for fun meetup held by dotnetzone.gr.

I really enjoy these community-driven events, as they're so different from all the other events I present at.

They're more relaxed and interactive, more like a discussion between colleagues sharing their experiences with a particular technology than a monologue.

It's more about coding and much less about PowerPoint. At this last event I showed only 4 slides, 2 of which were the title and the thank-you slides. The rest of the time I spent coding together with the participants.

We organize these sessions in such a way that each one picks up from where the previous one ended, so there's a sense of continuity. For example, we first built a Windows 8 app using HTML / TypeScript, and in my session we used Azure Mobile Services to build a backend for it.

We end these sessions with pizza 🙂

I really encourage you to be part of this; I can guarantee you'll really enjoy it too. Plus, if you want, you can take the stage as well. We're open to anyone who would like to share their experiences in building software.

For those of you who couldn't attend the last session, I'm sharing the source code of the backend Azure Mobile Service we built.


Microsoft.Net.Http was released today as a stable NuGet package. Yep, that’s right: You can finally start using the portable HttpClient 2.1 in production!

This NuGet package makes a standard set of HttpClient APIs available across a wider array of platforms, including Windows Phone 7.5 and higher, .NET Framework 4.0 and higher, and Windows Store. This enables you to share more .NET code across these platforms.
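To give you an idea, here is a minimal sketch of the kind of code that now compiles unchanged across all these platforms (the URL is just a placeholder):

using System;
using System.Net.Http;

class Program
{
    static void Main()
    {
        // HttpClient from the Microsoft.Net.Http package; the same code works
        // on .NET 4.0+, Windows Phone 7.5+ and Windows Store apps.
        using (var client = new HttpClient())
        {
            string body = client.GetStringAsync("http://example.com/api/values").Result;
            Console.WriteLine(body);
        }
    }
}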

Read more about it in the announcement on the .NET Blog.


Now that my blogging engine is back on track, it's time for the long-awaited third part of my Google Reader alternative series, and time to show you how I implemented the server-side code that reads news for the subscriptions I stored in SQL Azure in part 2.

Since Windows Azure Mobile Services allows you to schedule and run recurring jobs in your backend, I thought that my Google Reader service could define some server script code that would be executed on a schedule I defined and could download the RSS entries for each of my subscriptions. So I created a new scheduled job by clicking the Scheduler tab and then the +Create button at the bottom of the page.


In the scheduler dialog, I entered DownloadPosts as the job name, set the schedule interval and units, and then clicked the check button.

Once the job was created, I clicked the Script tab at the top to write my business logic for fetching the RSS feeds.


The code I wrote was pretty simple: for each subscription stored in my Feeds table, I called the URL stored in the FeedUrl field of the feed entry.

function DownloadPosts() {
    console.log("Running post import.");
    var feedsTable = tables.getTable("feed");
    feedsTable.read({
        success: function(results) {
            results.forEach(function(feed){
                importPosts(feed);
            });
        }});
}

The importPosts function was responsible for making the actual HTTP call and storing the data in the posts table.

function importPosts(feed) {
    var req = require('request');
    console.log("Requesting item '%j'.", feed);
    req.get({
            // Assumption: the subscription's address lives in the FeedUrl
            // field mentioned above.
            url: feed.FeedUrl
        },
        function(error, result, body) {
            //1. Parse body
            //2. Store in the database
        }
    );
}

Unfortunately, Windows Azure Mobile Services does not yet provide an XML Node module (according to the team, they're planning to add more modules in the near future) that could parse the XML-formatted RSS feeds returned by the subscriptions; only a JSON module is available for consuming JSON feeds. So at this point I had two options: either move my post-retrieval logic to my client, or build a proxy to convert the XML RSS feed to a JSON one. I decided to go with the second for two reasons.

  • If I moved the download logic to the client, it would have to be replicated in each client I used.
  • Server-side parsing of the XML feed is going to be available in the near future, so there's no point in replicating the code; it will be relatively easy to drop the proxy and parse the feed directly in the scheduled job once the XML Node module is released.

So I built a simple Cloud Service containing a very basic web role that converts RSS feeds to JSON. The web role contains an HTTP handler that is passed a feed ID as a query string argument. It looks up the ID in the Subscriptions table, finds the feed URL, and calls it to get the XML data. It then uses the Newtonsoft.Json serializer to serialize objects containing the data from the parsed RSS feeds. The code looks like this:

using System;
using System.Collections.Generic;
using System.Net;
using System.ServiceModel.Syndication;
using System.Web;
using System.Linq;
using System.Xml.Linq;
using CloudReader.Model;
using System.Diagnostics;

namespace XmlToJsonApi
{
    public class Convert : IHttpHandler
    {
        const string CONTENT_NAMESPACE = "http://purl.org/rss/1.0/modules/content/";
        const string MOBILESERVICE_ENDPOINT = "https://cloudreader.azure-mobile.net/tables/feed/{0}";

        public int FeedID
        {
            get 
            { 
                return GetFromQueryString("id", 0); 
            }
        }

        /// <summary>
        /// You will need to configure this handler in the Web.config file of your 
        /// web and register it with IIS before being able to use it. For more information
        /// see the following link: http://go.microsoft.com/?linkid=8101007
        /// </summary>
        #region IHttpHandler Members

        public bool IsReusable
        {
            // Return false in case your Managed Handler cannot be reused for another request.
            // Usually this would be false in case you have some state information preserved per request.
            get { return false; }
        }

        public void ProcessRequest(HttpContext context)
        {
            SyndicationFeed feed = GetFeed(FeedID);

            var datafeed = ParseFeed(feed);

            string data = Newtonsoft.Json.JsonConvert.SerializeObject(datafeed.Select(pe => new 
                {
                    Author = pe.Author,
                    CommentsCount = 0,
                    Description = pe.Description,
                    Content = pe.Content,
                    FeedId = pe.FeedId,
                    IsRead = pe.IsRead,
                    Link = pe.Link,
                    PubDate = pe.PubDate,
                    Stared = pe.Stared,
                    Title = pe.Title
                }), Newtonsoft.Json.Formatting.Indented);

            context.Response.ContentType = "application/json";
            context.Response.Write(data);
        }

        private List<Post> ParseFeed(SyndicationFeed feed)
        {
            List<Post> posts = new List<Post>();
            if (feed != null)
            {
                // Use the feed   
                foreach (SyndicationItem item in feed.Items)
                {
                    try
                    {
                        Post feedItem = new Post()
                        {
                            Author = string.Join(", ", item.Authors.Select(sp => sp.Name)),
                            CommentsCount = 0,
                            Description = item.Summary != null ? item.Summary.Text : "",
                            Content = GetContent(item),
                            FeedId = FeedID,
                            IsRead = false,
                            Link = item.Links.Select(l => l.Uri.AbsoluteUri).FirstOrDefault(),
                            PubDate = item.PublishDate.UtcDateTime,
                            Stared = false,
                            Title = item.Title.Text
                        };
                        posts.Add(feedItem);
                    }
                    catch (Exception exception)
                    {
                        HandleException(exception);
                    }
                }
            }
            return posts;
        }

        private string GetContent(SyndicationItem item)
        {
            string content = null;
            var textContent = item.Content as TextSyndicationContent;
            if (textContent != null)
            {
                // Inline content element (e.g. Atom <content>).
                content = textContent.Text;
            }
            else
            {
                // RSS feeds often carry the full post body in <content:encoded>.
                var elem = item.ElementExtensions.FirstOrDefault(
                    e => e.OuterNamespace == CONTENT_NAMESPACE && e.OuterName == "encoded");
                if (elem != null)
                {
                    content = elem.GetObject<string>();
                }
            }
            return content;
        }

        private SyndicationFeed GetFeed(int feedId)
        {
            try
            {
                using (WebClient client = new WebClient())
                {
                    var buffer = client.DownloadString(string.Format(MOBILESERVICE_ENDPOINT, feedId));
                    Feed feed = Newtonsoft.Json.JsonConvert.DeserializeObject<Feed>(buffer);
                    XDocument doc = XDocument.Load(feed.XmlUrl);
                    return SyndicationFeed.Load(doc.CreateReader());
                }
            }
            catch (Exception ex)
            {
                HandleException(ex);
            }
            return null;
        }

        private void HandleException(Exception ex)
        {
            Trace.WriteLine(ex.Message);
            Trace.WriteLine(ex.StackTrace);
        }

        public int GetFromQueryString(string requestParameter, int defaultValue)
        {
            string value = HttpContext.Current.Request.QueryString[requestParameter];

            if (!String.IsNullOrEmpty(value))
            {
                int iValue = -1;

                if (Int32.TryParse(value, out iValue))
                {
                    return iValue;
                }
            }

            return defaultValue;
        }
        #endregion
    }
}

As you can see, the handler uses the REST API of my Mobile Service to access the feed; there's no secret key involved, nor any Mobile Services managed API call. I'm just calling https://cloudreader.azure-mobile.net/tables/feed/{0}, passing the ID, and getting back a JSON-serialized Feed object. For this to work I had to change the security permissions on the Feed table and allow read access to everyone.
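For example, fetching the feed with ID 1 (an arbitrary ID, just for illustration) is nothing more than a plain HTTP GET:

GET https://cloudreader.azure-mobile.net/tables/feed/1 HTTP/1.1
Accept: application/json
Host: cloudreader.azure-mobile.net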


Once I deployed and tested my proxy cloud service, I revisited my Windows Azure Mobile Services scheduled job to complete it. The scheduled job now calls the proxy for each feed to convert the RSS feed to JSON and then inserts each post that is not already in the database. The final code looks like this:

function importPosts(feed) {
    var req = require('request');
    console.log("Requesting item '%j'.", feed);
    req.get({
            url: "http://cloudrssreader.cloudapp.net/xmltojson.ashx?id=" + feed.id,
            headers: {accept : 'application/json'}
        },
        function(error, result, body) {
            var json = JSON.parse(body);
            json.forEach(function(post) {
                var postsTable = tables.getTable("post");
                postsTable.where({Link : post.Link}).read({success : function(existingPosts){
                    if (existingPosts.length === 0){
                        console.log("Inserting item '%j'.", post);
                        postsTable.insert(post);
                    }
                }});
            });
        }
    );
}

Once I had my scheduled job enabled, new posts were being stored in my database every hour, ready to be presented in my clients.

Stay tuned to read, in part 4 of this series, how I consumed these posts in the Windows 8 client I talked about in part 1.


Great news arrived yesterday, and it wasn't an April Fools' joke.

I was re-awarded the Windows Azure MVP title. This is my 7th consecutive year as an MVP, and it feels great being part of this community, not only because of the benefits (which are great too :D) but also because you get the chance to actually reach out and help other people.

As I promised last year and the year before, I will try to engage more with the community through my blogging, speaking, and writing on Windows Azure, and I hope that I will meet the MVP program's high entrance standards next year as well :D.

For those who don't know what an MVP really means, check out the MVP Program site and then read my interview on Alessandro's (my MVP lead) blog, as well as my interview at MicrosoftFeed.


I've been working with Windows 8 and Metro-style apps since Microsoft released its first private beta. I had to format my PC several times along the way due to bugs or incompatibilities with my hardware, but it was worth it: I gained very useful experience and was able to get a head start on Windows 8 application development.

So today I'm happy to announce two brand new free Windows 8 applications available in the Windows Store.

MeteoGR

Meteo is the leading meteorological website in Greece, backed by the National Observatory of Athens and providing accurate and precise weather forecasts for Greece. A while back we approached the guys there to build a proof-of-concept application that would present their data using the Metro-style experience.

After struggling for a while with the APIs and the application certification team of the Windows Store :-), the application hit the Store in October and has already achieved almost 5,000 downloads.

The application features almost all of the available Windows 8 application features, such as Live Tiles, Roaming Settings, Semantic Zoom, the Settings contract, and more. It is built using XAML and C#, takes advantage of the MVVM pattern, and is of course backed by Microsoft Windows Azure.
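To give you an idea of how little code some of these features require, here is a minimal, illustrative sketch of updating a live tile and persisting a roaming setting. This is not the app's actual code, just the WinRT calls involved.

using Windows.Data.Xml.Dom;
using Windows.Storage;
using Windows.UI.Notifications;

static class Windows8FeaturesSketch
{
    public static void UpdateTileAndSettings(string headline)
    {
        // Live Tile: fill a stock wide-tile template with a headline.
        XmlDocument tileXml = TileUpdateManager.GetTemplateContent(TileTemplateType.TileWideText03);
        tileXml.GetElementsByTagName("text")[0].InnerText = headline;
        TileUpdateManager.CreateTileUpdaterForApplication().Update(new TileNotification(tileXml));

        // Roaming Settings: the value follows the user to their other PCs.
        ApplicationData.Current.RoamingSettings.Values["lastHeadline"] = headline;
    }
}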

You can download the app here: http://apps.microsoft.com/webpdp/en-us/app/meteogr/1869f228-540a-469a-9581-64c4771378e8


Zougla

Zougla is one of the largest and probably the most visited news portals in Greece. Besides keeping visitors up to date with all the latest news from Greece and the rest of the world, Zougla offers a wide variety of services ranging from news and weather to online TV and radio. They even run their own marketplace.

Being tech savvy and happy to work on new ideas, they gladly agreed to work on a new experience for Windows 8. The result is the just-released application: http://apps.microsoft.com/webpdp/app/zougla/ffda30c7-357e-4207-a437-be60871dfc9b

The application features Live Tiles, Roaming Settings, Share contracts, the Settings contract, Video on Demand, and Live Smooth Streaming, with more planned for the next version. It too is built using XAML and C# and the MVVM pattern.

Download and install them. I’d love to hear what you think. Hope you like them.


Five years ago my primary hard disk failed, taking some of my precious data, like family pictures, videos, personal files, and code, away forever. I was able to recover some of it but had to pay a hefty fee to the service guys. After the “meltdown incident” I decided to keep triplicates of my data on two more disks to prevent this from ever happening again. Setting this up, though, was quite a pain, since I use a laptop and am constantly on the move, so I had to remember to sync everything the minute I got home to avoid losing anything.

From time to time I considered moving my data to the cloud, to avoid all the hassle and worry of syncing the data but also to be able to access it from wherever I was. There were a couple of services I looked at, but two things kept me from going through with it: the price, and trust (not that someone would steal my data; who would care about my children's photos, after all? But what happens if the company is sold, closes, or its hardware fails?).

Microsoft's SkyDrive would have been the perfect solution to my problems. It was free (now it's very cheap: free for 25GB and $100/year for 125GB) and I could trust Microsoft with my data, but it didn't have a desktop client that would automatically sync it to the cloud; well… that is, until last week. Last week Microsoft released preview versions of SkyDrive for Windows and Mac, along with updates for iOS and Windows Phone. With SkyDrive on my desktop, I can now store and access files in the cloud right from any of my PCs, anywhere in the world, and not worry about losing anything.

But that's not all. What's most important is that together with the release of the SkyDrive desktop app, Microsoft released a set of REST-based APIs, the next version of the Live Connect APIs, and the newly created Live SDK, which can be consumed from any platform and device, bringing my data to all my devices: desktop, Windows 8 tablet, and Windows Phone 7.


Here is a sample HTTP request to retrieve a list of a user’s entire set of folders in SkyDrive:

GET https://apis.live.net/V5.0/me/skydrive/files?access_token=ACCESS_TOKEN HTTP/1.1
User-Agent: Fiddler
Host: apis.live.net

The above request returns the following JSON result set for my SkyDrive account: 

{"data": 

 [{

    "id": "folder.616444ee7a34f417.616444EE7A34F417!12045",

    "from":{"name": "Dare Obasanjo",

    "id": "616444ee7a34f417"},

    "name": "Wave 4 Feedback",

    "description": null,

    "parent_id": "folder.616444ee7a34f417",

    "upload_location": "https://apis.live.net/v5.0/folder.616444ee7a34f417.616444EE7A34F417!12045/files/",

    "count": 14,

    "link": "https://skydrive-df.live.com/redir.aspx?cid=616444ee7a34f417&page=view&resid=616444EE7A34F417!12045&parid=616444EE7A34F417!1967",

    "type": "album",

    "shared_with":

    {

       "access": "Everyone (public)"

    },

    "created_time": "2010-07-14T13:28:48+0000",

    "updated_time": "2011-07-18T03:40:07+0000"

 }

]} 

Uploading files is similarly straightforward. Applications can use either HTTP PUT or POST requests to upload documents, photos or videos to SkyDrive at the folder’s upload location. It should be noted that not all file formats are supported for upload to SkyDrive.

Below is an example of uploading a text file using HTTP PUT:

PUT https://apis.live.net/v5.0/me/skydrive/files/HelloWorld.txt?access_token=ACCESS_TOKEN

Hello, World!

and here’s what the same upload looks like using HTTP POST:

POST https://apis.live.net/v5.0/me/skydrive/files?access_token=ACCESS_TOKEN
Content-Type: multipart/form-data; boundary=A300x

--A300x
Content-Disposition: form-data; name="file"; filename="HelloWorld.txt"
Content-Type: application/octet-stream

Hello, World!
--A300x--

These are just a few examples of how easy it is to interact with SkyDrive using nothing but regular HTTP. Other operations such as editing, copying or sharing files are similarly straightforward. For an easy way to try the Live Connect REST API for yourself, visit the interactive SDK.
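And because it's all plain HTTP, any stack can consume it. Here's a minimal C# sketch of the folder-listing request above using HttpClient; you would first need to obtain a real access token through the Live Connect sign-in flow.

using System;
using System.Net.Http;

class SkyDriveSketch
{
    static void Main()
    {
        // Same GET as the Fiddler example above; ACCESS_TOKEN is a placeholder
        // for a token obtained via the Live Connect OAuth sign-in.
        using (var client = new HttpClient())
        {
            string json = client
                .GetStringAsync("https://apis.live.net/v5.0/me/skydrive/files?access_token=ACCESS_TOKEN")
                .Result;
            Console.WriteLine(json); // the JSON folder list shown earlier
        }
    }
}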

So what do you think, ready to write your application using SkyDrive as the storage engine?


Well, as mentioned in my previous post, last Tuesday I had the chance to speak about Prism and Unity at a DotNetZone event.

I believe the talk went really well; at least the attendees seemed very interested in those two frameworks. Unfortunately, time was rather short for covering both topics, so I feel I need to apologize for going a bit too fast.

Anyway, I'm putting up my slide deck and demo code as promised, in case you want to dive in and have a go yourself with Prism.
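If you'd like a quick taste before diving into the demos, here's a minimal Unity sketch; the IMessageService types are made up for illustration and aren't part of the demo code.

using System;
using Microsoft.Practices.Unity;

public interface IMessageService
{
    void Send(string message);
}

public class EmailMessageService : IMessageService
{
    public void Send(string message)
    {
        Console.WriteLine("Sending: " + message);
    }
}

class Program
{
    static void Main()
    {
        // Register the interface-to-implementation mapping once,
        // then resolve it wherever you need it.
        var container = new UnityContainer();
        container.RegisterType<IMessageService, EmailMessageService>();

        IMessageService service = container.Resolve<IMessageService>();
        service.Send("Hello from Unity");
    }
}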

Don't hesitate to contact me if you need more help with these.