Archive for May, 2013

Microsoft.Net.Http was released today as a stable NuGet package. Yep, that’s right: You can finally start using the portable HttpClient 2.1 in production!

This NuGet package makes a standard set of HttpClient APIs available across a wider array of platforms, including Windows Phone 7.5 and higher, .NET Framework 4.0 and higher, and Windows Store. This enables you to share more .NET code across these platforms.

Read more about it in the announcement on the .NET Blog.


Now that my blogging engine is back on track, it’s time for the long-awaited third part of my Google Reader alternative series, in which I’ll show you how I implemented the server-side code that reads news for the subscriptions I stored in SQL Azure in part 2.

Since Windows Azure Mobile Services allows you to schedule and run recurring jobs in your backend, I thought that my Google Reader service could define some server script code that would be executed on a schedule I defined and would download the RSS entries for each of my subscriptions. So I created a new scheduled job by clicking the Scheduler tab, then the +Create button at the bottom of the page.

scheduled

In the scheduler dialog, I entered DownloadPosts as the Job Name, I set the schedule interval and units, and then clicked the check button.

Once the job was created I clicked the script tab at the top in order to write my business logic to fetch the RSS feeds.

image

The code I wrote was pretty simple: for each subscription stored in my Feeds table, I called the URL stored in the FeedUrl field of the Feed entry.

function DownloadPosts() {
    console.log("Running post import.");
    var feedsTable = tables.getTable("feed");
    feedsTable.read({
        success: function(results) {
            results.forEach(function(feed){
                importPosts(feed);
            });
        }});
}

The importPosts function was responsible for making the actual HTTP call and storing the data in the posts table.

function importPosts(feed) {
    var req = require('request');
    console.log("Requesting item '%j'.", feed);
    req.get({
            url: ""
        },
        function(error, result, body) {
            //1. Parse body
            //2. Store in the database
        }
    );
}

Unfortunately, Windows Azure Mobile Services does not yet provide an XML Node module that could parse the XML-formatted RSS feeds returned by my subscriptions (according to the team, more modules are planned for the near future); only a JSON module is available, for consuming JSON feeds. So at this point I had two options: either move my post-retrieval logic to my clients, or build a proxy to convert the XML RSS feed to a JSON one. I decided to go with the second option, for two reasons.

  • If I moved the download logic to the client then this would have to be replicated to each client I used.
  • Server-side XML parsing is coming in the near future, so there is no point in replicating the code; it will be relatively easy to replace the proxy and parse the feed directly in the scheduled job once the XML Node module is released.
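To make the proxy’s contract concrete, here’s roughly the mapping I had in mind, sketched in the same JavaScript used by the scheduled job. This is only an illustration: the rssItem field names (title, link, author, summary, content, pubDate) are assumptions, while the output properties match the Post table used throughout this series.

```javascript
// Sketch: the shape of one row the proxy should emit for the posts table,
// mapped from a parsed RSS item. The rssItem field names are hypothetical.
function toPostRow(feedId, rssItem) {
    return {
        FeedId: feedId,
        Title: rssItem.title,
        Link: rssItem.link,
        Author: rssItem.author || "",
        Description: rssItem.summary || "",
        Content: rssItem.content || "",
        PubDate: rssItem.pubDate,
        CommentsCount: 0,
        Stared: false,
        IsRead: false
    };
}

var demoRow = toPostRow(3, { title: "Hello", link: "http://example.com/1", pubDate: "2013-05-01" });
console.log(JSON.stringify(demoRow));
```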

So I built a simple Cloud Service containing a very basic web role that converts RSS feeds to JSON. The web role contains an HTTP handler that receives a feed ID as a query-string argument, looks up the ID in the subscriptions table, finds the feed URL, and calls it to get the XML data. It then uses the Newtonsoft.Json serializer to serialize objects that contain the data from the parsed RSS feed. The code looks like this:

using System;
using System.Collections.Generic;
using System.Net;
using System.ServiceModel.Syndication;
using System.Web;
using System.Linq;
using System.Xml.Linq;
using CloudReader.Model;
using System.Diagnostics;

namespace XmlToJsonApi
{
    public class Convert : IHttpHandler
    {
        const string CONTENT_NAMESPACE = "http://purl.org/rss/1.0/modules/content/";
        const string MOBILESERVICE_ENDPOINT = "https://cloudreader.azure-mobile.net/tables/feed/{0}";

        public int FeedID
        {
            get 
            { 
                return GetFromQueryString("id", 0); 
            }
        }

        /// <summary>
        /// You will need to configure this handler in the Web.config file of your 
        /// web and register it with IIS before being able to use it. For more information
        /// see the following link: http://go.microsoft.com/?linkid=8101007
        /// </summary>
        #region IHttpHandler Members

        public bool IsReusable
        {
            // Return false in case your Managed Handler cannot be reused for another request.
            // Usually this would be false in case you have some state information preserved per request.
            get { return false; }
        }

        public void ProcessRequest(HttpContext context)
        {
            SyndicationFeed feed = GetFeed(FeedID);

            var datafeed = ParseFeed(feed);

            string data = Newtonsoft.Json.JsonConvert.SerializeObject(datafeed.Select(pe => new 
                {
                    Author = pe.Author,
                    CommentsCount = 0,
                    Description = pe.Description,
                    Content = pe.Content,
                    FeedId = pe.FeedId,
                    IsRead = pe.IsRead,
                    Link = pe.Link,
                    PubDate = pe.PubDate,
                    Stared = pe.Stared,
                    Title = pe.Title
                }), Newtonsoft.Json.Formatting.Indented);

            context.Response.ContentType = "application/json";
            context.Response.Write(data);
        }

        private List<Post> ParseFeed(SyndicationFeed feed)
        {
            List<Post> posts = new List<Post>();
            if (feed != null)
            {
                // Use the feed   
                foreach (SyndicationItem item in feed.Items)
                {
                    try
                    {
                        Post feedItem = new Post()
                        {
                            Author = string.Join(", ", item.Authors.Select(sp => sp.Name)),
                            CommentsCount = 0,
                            Description = item.Summary != null ? item.Summary.Text : "",
                            Content = GetContent(item),
                            FeedId = FeedID,
                            IsRead = false,
                            Link = item.Links.Select(l => l.Uri.AbsoluteUri).FirstOrDefault(),
                            PubDate = item.PublishDate.UtcDateTime,
                            Stared = false,
                            Title = item.Title.Text
                        };
                        posts.Add(feedItem);
                    }
                    catch (Exception exception)
                    {
                        HandleException(exception);
                    }
                }
            }
            return posts;
        }

        private string GetContent(SyndicationItem item)
        {
            string content = null;
            if (item.Content != null)
            {
                // Note: calling ToString() on SyndicationContent returns the type
                // name, not the text, so cast to TextSyndicationContent first.
                var textContent = item.Content as TextSyndicationContent;
                content = textContent != null ? textContent.Text : null;
            }
            else
            {
                // Look for a content:encoded extension element (common in RSS feeds).
                var elem = item.ElementExtensions
                    .FirstOrDefault(e => e.OuterNamespace == CONTENT_NAMESPACE && e.OuterName == "encoded");
                if (elem != null)
                {
                    content = elem.GetObject<string>();
                }
            }
            return content;
        }

        private SyndicationFeed GetFeed(int feedId)
        {
            try
            {
                using (WebClient client = new WebClient())
                {
                    var buffer = client.DownloadString(string.Format(MOBILESERVICE_ENDPOINT, feedId));
                    Feed feed = Newtonsoft.Json.JsonConvert.DeserializeObject<Feed>(buffer);
                    XDocument doc = XDocument.Load(feed.XmlUrl);
                    return SyndicationFeed.Load(doc.CreateReader());
                }
            }
            catch (Exception ex)
            {
                HandleException(ex);
            }
            return null;
        }

        private void HandleException(Exception ex)
        {
            Trace.WriteLine(ex.Message);
            Trace.WriteLine(ex.StackTrace);
        }

        public int GetFromQueryString(string requestParameter, int defaultValue)
        {
            string value = HttpContext.Current.Request.QueryString[requestParameter];

            if (!String.IsNullOrEmpty(value))
            {
                int iValue = -1;

                if (Int32.TryParse(value, out iValue))
                {
                    return iValue;
                }
            }

            return defaultValue;
        }
        #endregion
    }
}

As you can see, the handler uses the REST API of my Mobile Service to access the feed. There’s no secret key involved, nor any Mobile Services managed API call; I’m just calling https://cloudreader.azure-mobile.net/tables/feed/{0}, passing the ID, and getting back a JSON-serialized Feed object. For this to work I had to change the security permissions on the Feed table and allow read access to everyone.
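That endpoint is just the standard WAMS URL convention: service name, /tables/, table name, and optionally a row id. A tiny helper (hypothetical, purely to make the convention explicit) would look like this:

```javascript
// Builds the Windows Azure Mobile Services REST endpoint for a table
// (optionally for a single row). "cloudreader" is the service name used
// throughout this post; the URL pattern is the standard WAMS tables API.
function tableUrl(service, table, id) {
    var url = "https://" + service + ".azure-mobile.net/tables/" + table;
    return (id === undefined) ? url : url + "/" + id;
}

console.log(tableUrl("cloudreader", "feed", 42));
// -> https://cloudreader.azure-mobile.net/tables/feed/42
```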

image

Once I deployed and tested my proxy cloud service, I revisited my Windows Azure Mobile Services scheduled job to complete it. Now the scheduled job calls the proxy for each feed to convert the RSS feed to JSON, and then inserts each post that is not already in the database. The final code looks like this:

function importPosts(feed) { 
    var req = require('request'); 
    console.log("Requesting item '%j'.", feed); 
    req.get({ 
            url: "http://cloudrssreader.cloudapp.net/xmltojson.ashx?id=" + feed.id, 
            headers: {accept : 'application/json'} 
        }, 
        function(error, result, body) { 
            var json = JSON.parse(body); 
            json.forEach(function(post) { 
                var postsTable = tables.getTable("post");
                postsTable.where({Link : post.Link}).read({success : function(existingPosts){
                    if (existingPosts.length === 0){
                        console.log("Inserting item '%j'.", post); 
                        postsTable.insert(post);
                    }
                }});
            }); 
        } 
    ); 
}
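The existence check in the script above boils down to “insert only the posts whose Link isn’t already stored”. Factored out as a plain function (a sketch; the actual script does this with a where({Link: …}) query per post), the idea is:

```javascript
// Returns only the posts whose Link is not already in existingLinks.
// This mirrors the per-post where({Link: ...}) check in the scheduled job.
function postsToInsert(fetched, existingLinks) {
    return fetched.filter(function (post) {
        return existingLinks.indexOf(post.Link) === -1;
    });
}

var fresh = postsToInsert(
    [{ Link: "http://a" }, { Link: "http://b" }],
    ["http://a"]);
console.log(fresh.length); // -> 1
```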

Once I had my scheduled job enabled, I had new posts being stored in my database every hour ready to be presented in my clients.

Stay tuned to read, in part 4 of this series, how I consumed my posts in the Windows 8 client I talked about in part 1.


As you may have already read, a couple of months back I migrated my blog to Windows Azure Web Sites and changed my blog engine to WordPress. The WordPress gallery template comes with a free 20MB MySQL database offer from ClearDB, which was more than I needed (or so I thought), so I went ahead and used it.

Lately though I’ve been experiencing some strange behavior: at unpredictable intervals, my changes stopped being persisted to the database. After getting in contact with the ClearDB guys, I learned that the MySQL engine was generating temp tables to facilitate the queries being made against it, and the size of the data in these tables was greater than my quota, leading to a block. The conversation reached a point where the ClearDB guy said:

…I’m sorry for the inconvenience. If you’re not able to optimize your queries, and I gather that you’re using a stock WordPress install, the solution is simply to upgrade to the next service tier… (You can read the full transcript here)

I got so frustrated by this that I decided to investigate whether moving to SQL Azure was an option, and guess what… you can easily use SQL Azure with your WordPress Azure Web Site. Here’s how.

webmatrix

Step 1

Create a new WordPress site locally using WebMatrix by clicking New and selecting App Gallery from the menu.

Step 2

Download WP Db Abstraction. This plugin provides database access abstraction and SQL dialect abstraction for SQL Server. It is an mu (must-use) plugin and also a db.php drop-in.

Step 3

Put the wp-db-abstraction.php file and the wp-db-abstraction directory into wp-content/mu-plugins. This should be parallel to your regular plugins directory; if the mu-plugins directory does not exist, you must create it. Copy the db.php file from inside the wp-db-abstraction directory to wp-content/db.php. Rename the wp-config.php file in the root folder to wp-config_old.php, since this file is going to be regenerated by the setup wizard in a later step.

Step 4

Publish your newly created WordPress site to an empty Azure WebSite slot.

Step 5

Create a new SQL Azure database

Step 6

Once the site is published, visit http://your_wordpress_url/wp-content/mu-plugins/wp-db-abstraction/setup-config.php to generate your wp-config.php file, and follow the instructions to connect WordPress with the database you created in the previous step. In the database type field you’ll have to select BDO_SQL as the type to connect to SQL Azure. Note that WordPress must be able to connect to the database, otherwise you’ll get an error.

Step 7

Complete the normal WordPress installation

Step 8

There is a known bug in WP Db Abstraction that causes your posts and media files not to appear in their corresponding lists in the admin. You can read more about it here. So you’ll have to edit the translate_limit function in the wp-db-abstraction/translations/sqlsrv/translations.php file:

// Check for true offset
if ( (count($limit_matches) == 5  )  && $limit_matches[1] != '0' ) {
    $true_offset = true;
} elseif ( (count($limit_matches) == 5 )  && $limit_matches[1] == '0' ) {
    $limit_matches[1] = $limit_matches[4];
}

The problem is that $limit_matches can return 6 items while the validation asks for exactly 5; that’s all.

// Check for true offset
if ( (count($limit_matches) == 5  )  && $limit_matches[1] != '0' ) {
    $true_offset = true;
} elseif ( (count($limit_matches) >= 5 )  && $limit_matches[1] == '0' ) {
    $limit_matches[1] = $limit_matches[4];
}

Just change the == in the elseif to >= and it works like a charm.

If you already had a WordPress installation, like I did, you can easily migrate your data to your new SQL Azure-backed installation: just export all your data from your old site and import it into the new one.

Now, at half the price ($4.99 per month) of what I would have had to pay ClearDB to upgrade to the next service tier, I’ve got all the space I’m ever going to need (up to 100MB), all of it available for my data rather than temp tables.


In Part 1 of this series I’ve shown you how easy it was to build my Google Reader backing service using Windows Azure Mobile Services. In this part I’ll show you how I put that service to work using a Windows 8 client application and how I managed my data with it.

The client application I had been using with Google Reader so far was Modern Reader, and I must admit I had grown quite fond of it. So I decided to build something similar: a Windows 8 application that would connect and sync with my newly created service. I fired up Visual Studio and created my client using the new Windows 8 application wizard.

Now, from the Windows Azure Mobile Services tab you can get detailed instructions on how to connect your existing applications to the service, or even download a project already hooked up to the service for any of the available platforms.

instructions

So the first thing I did once my application had been created was to connect it with my mobile service. I first imported the Windows Azure Mobile Services NuGet package to get the necessary assemblies into my project (instead of installing the SDK), and then added a static member to my App class to allow easy access to my service, as the guide suggests.

public static MobileServiceClient MobileService = new MobileServiceClient(
    "https://cloudreader.azure-mobile.net/", 
    "YOUR_MOBILE-SERVICE-KEY");

Having hooked my app up with my CloudReader mobile service, I needed a way to manage (add/remove) and import my feeds. As mentioned in my previous post, I had already downloaded my data from Google Reader, so I had an OPML-formatted XML file containing all my subscriptions; I needed some UI to help me pick that file, parse it, and save my subscription data to my service. I thought the best place for this UI was probably a custom flyout spawned from the Settings charm, like the Classic RSS app does. So, using the new page wizard from Visual Studio, I created a Settings Flyout page and added a few controls to help me pick the OPML file. (DISCLAIMER: I’m not a designer, so the UI I created is probably not the best you’ve seen, but it gets the job done. We’ll talk about design more extensively in the last post of this series.) The resulting UI looked something like this:

SettingsFlyout

In order to save some space in this post, I will post the full XAML source code of this page, as well as of every other page in my solution, at the end. To hook it up with the Settings charm, all I had to do was handle the CommandsRequested event for the current view immediately after window activation.

SettingsPane.GetForCurrentView().CommandsRequested += App_CommandsRequested;

Then, in the event handler, I instantiated a new UICommandInvokedHandler and added a SettingsCommand that invokes it when selected.

void App_CommandsRequested(SettingsPane sender, SettingsPaneCommandsRequestedEventArgs args)
{
    UICommandInvokedHandler handler = new UICommandInvokedHandler(OnSettingsCommand);

    SettingsCommand generalCommand = new SettingsCommand("FeedsId", "Feeds", handler);
    args.Request.ApplicationCommands.Add(generalCommand); 
}

OnSettingsCommand, in turn, created a new Popup at runtime, set its child control to the settings flyout page I had already created, and then opened it with some transitions.

void OnSettingsCommand(IUICommand command)
{
    // Create a Popup window which will contain our flyout. 
    settingsPopup = new Popup();
    settingsPopup.Closed += settingsPopup_Closed;
    Window.Current.Activated += Current_Activated;
    settingsPopup.Width = settingsWidth;
    settingsPopup.Height = Window.Current.Bounds.Height;

    // Add the proper animation for the panel. 
    settingsPopup.ChildTransitions = new TransitionCollection();
    settingsPopup.ChildTransitions.Add(new PaneThemeTransition()
    {
        Edge = (SettingsPane.Edge == SettingsEdgeLocation.Right) ?
               EdgeTransitionLocation.Right :
               EdgeTransitionLocation.Left
    });

    // Create a SettingsFlyout the same dimensions as the Popup. 
    SettingsFlyout mypane = new SettingsFlyout();
    mypane.Width = settingsWidth;
    mypane.Height = Window.Current.Bounds.Height;

    // Place the SettingsFlyout inside our Popup window. 
    settingsPopup.Child = mypane;

    // Let's define the location of our Popup. 
    settingsPopup.SetValue(Canvas.LeftProperty, SettingsPane.Edge == SettingsEdgeLocation.Right ? 
        (Window.Current.Bounds.Width - settingsWidth) : 0);
    settingsPopup.SetValue(Canvas.TopProperty, 0);
    settingsPopup.IsOpen = true; 
}

Now all I had to do to complete my feed management / importing was to actually write the code for it in my settings flyout page. Since I didn’t want my code to be coupled to the page, though, I decided to use MVVM. So I used NuGet again to import MVVM Light and created a ViewModel for my settings flyout page.

The ViewModel code was pretty straightforward: I used two commands, triggered by the buttons on my UI, to call the ImportFeeds method, which in turn handled all the business logic.

public class SettingsViewModel : NavigationViewModel
 {

     public SettingsViewModel()
     {
         ////if (IsInDesignMode)
         ////{
         ////    // Code runs in Blend --> create design time data.
         ////}
         ////else
         ////{
         ////    // Code runs "for real"
         ////}
     }

     private StorageFile _selectedFile;
     public StorageFile SelectedFile
     {
         get { return _selectedFile; }
         set
         {
             _selectedFile = value;
             RaisePropertyChanged("SelectedFile");
         }
     }

     private RelayCommand _OpmlSelectCommand;
     public RelayCommand OpmlSelectCommand
     {
         get
         {
             return _OpmlSelectCommand
                 ?? (_OpmlSelectCommand = new RelayCommand(async () =>
                     {
                         var filePicker = new FileOpenPicker();
                         filePicker.FileTypeFilter.Add(".xml");
                         filePicker.ViewMode = PickerViewMode.List;
                         filePicker.SuggestedStartLocation = PickerLocationId.Downloads;
                         filePicker.SettingsIdentifier = "OPML Picker";
                         filePicker.CommitButtonText = "Select File";

                         SelectedFile = await filePicker.PickSingleFileAsync();
                     }));
         }
     }

     private RelayCommand _ImportFeedsCommand;
     public RelayCommand ImportFeedsCommand
     {
         get
         {
             return _ImportFeedsCommand
                 ?? (_ImportFeedsCommand = new RelayCommand(async () =>
                     {
                         if (SelectedFile != null)
                         {
                             var stream = await FileIO.ReadTextAsync(SelectedFile);
                             var opml = XDocument.Parse(stream);
                             if (opml != null)
                             {
                                 ImportFeeds(opml);
                             }
                         }
                     }));
         }
     }

     private async void ImportFeeds(XDocument opml)
     {
         foreach (var item in opml.Descendants("outline").Where(el => el.Attribute("xmlUrl") == null))
         {
             var feedGroupTable = App.MobileService.GetTable<FeedGroup>();
             var feedFolder = new FeedGroup()
                 {
                     Text = item.Attribute("text").Value,
                     Title = item.Attribute("title").Value
                 };
             await feedGroupTable.InsertAsync(feedFolder);

             foreach (var fds in item.Descendants("outline").Where(el => el.Attribute("xmlUrl") != null))
             {
                 var feedTable = App.MobileService.GetTable<Feed>();
                 var newFeed = new Feed()
                 {
                     FeedGroupId = feedFolder.Id,
                     FeedType = 1,
                     HtmlUrl = fds.Attribute("htmlUrl").Value,
                     Text = fds.Attribute("text").Value,
                     Title = fds.Attribute("title").Value,
                     XmlUrl = fds.Attribute("xmlUrl").Value
                 };
                 await feedTable.InsertAsync(newFeed);
             }

         }
     }

 }
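For readers who haven’t looked inside an OPML file: every subscription is an <outline> element carrying an xmlUrl attribute, optionally nested under a folder outline, which is exactly the structure the ImportFeeds method above walks. A rough Node sketch that pulls those URLs out with a regex (the real client uses XDocument; this is only meant to show the file’s shape):

```javascript
// Extracts the xmlUrl attribute of every <outline> element in an OPML string.
// A regex is fragile for real XML; a proper parser (like the XDocument code
// in the client) is the robust choice. This is illustration only.
function feedUrls(opml) {
    var urls = [], re = /xmlUrl="([^"]+)"/g, m;
    while ((m = re.exec(opml)) !== null) {
        urls.push(m[1]);
    }
    return urls;
}

var sample =
    '<opml><body><outline text="Dev">' +
    '<outline text="Blog A" xmlUrl="http://a.example.com/rss"/>' +
    '<outline text="Blog B" xmlUrl="http://b.example.com/rss"/>' +
    '</outline></body></opml>';
console.log(feedUrls(sample));
```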

The only interesting part of this piece of code is how easy it was to save my subscriptions to my database through my Windows Azure Mobile Service. All it took was two lines of code. First I got a reference to my table:

var feedGroupTable = App.MobileService.GetTable<FeedGroup>();

and then, for every feed in my OPML-formatted subscriptions file, I called:

await feedTable.InsertAsync(newFeed);

Now that my feeds were safely stored in my database, I had to find a way to download news for each feed.
Stay tuned to read, in part 3 of this series, how I wrote my server-side code to read news for my stored subscriptions.

I’ve been using RSS feed aggregators most of my life and have tried pretty much all the available versions and flavors out there. I started with standalone RSS aggregators, my favorite being FeedDemon, but a couple of years back I switched to Google Reader, as I was tired of reading the same stories again and again every time I used a different device. Using Google Reader I was able to keep all my devices in sync and pick up where I left off on whichever device I used. Plus, there were a number of very cool, fast native Windows 8 / Windows Phone apps available for all my devices that used the Google Reader API. So I was able to store my feeds centrally but work with a native app locally on my laptop, desktop, tablet and phone. So you can understand my frustration when I learned that Google had decided to retire the service.

Since then I’ve been trying to find an alternative. I visited most of the “best Google Reader alternatives” stories and tried most of the suggestions, but I couldn’t really find what I was looking for. Most importantly, none of the alternatives could convince me that my data was safe and that what happened with Google wouldn’t happen with them as well. And then it struck me: why not build something myself? And I did. Using Windows Azure Mobile Services I was able to build an RSS aggregation service in absolutely no time, one that I can literally use from any device out there. So in this series I will show you how you can take advantage of most of the Windows Azure Mobile Services features to build your own Google Reader alternative.

Creating a Windows Azure Mobile Service

So the first thing I did was create a new WAMS called CloudReader. There are plenty of resources available on how to set up your Windows Azure account and create your first Windows Azure Mobile Service, so I won’t go into much detail on that part; instead I’ll go ahead assuming that you’re pretty familiar with how to do that.

Data Model

GTakeout

Then I had to design my service data model. To do that, I had to think about the kind of data I would store. In my case I wanted to apply the KISS principle, so I decided to have only three different data types:

  • FeedGroup. Represents a folder/container of feeds that helps better organize my feeds
  • Feed. Represents the feed I’m following
  • Post. The posts I’ve downloaded and read.

Fortunately, Google Reader allows you to export your data through the Google Takeout service, so I was able to see all the data Google was storing and get a better idea of what data I would use in my service.

Finally the model I came up with, looked something like this:

image

Had I built the data model any other way, I would have had to use a database-initialization SQL script, or use SSMS (SQL Server Management Studio) to design the data model and then migrate the database to SQL Azure with a tool such as the SQL Azure Migration Wizard. Then I would have had to build a data access mechanism using LINQ or Entity Framework, build repositories, implement IoC and dependency injection, and write a bunch of server-side code just to access the database, let alone expose it as a service to my clients.

WAMS_Data

Using WAMS, all I had to do was create three tables by going to the DATA tab and clicking the Create table button at the bottom. Just by following the simple wizard, not only was I able to create an initial database schema, but I also got the data access layer, with everything exposed through a REST service automatically generated by WAMS.

To create the rest of the schema, I took advantage of the dynamic schema feature WAMS offers. When dynamic schema is enabled on your mobile service, new columns are created automatically when JSON objects are sent to the mobile service by an insert or update operation. So all I had to do was create a new CloudReader.Model portable class library (we’ll discuss the project-type choice in a later part) in which I introduced three POCO classes, one for each of my tables; once my application was ready, the insert operations on these objects would create the necessary database schema.

using System;

namespace CloudReader.Model
{
    public class FeedGroup
    {
        public int Id { get; set; }
        public string Text { get; set; }
        public string Title { get; set; }
    }

    public class Feed
    {
        public int Id { get; set; }
        public int? FeedGroupId { get; set; }
        public string Text { get; set; }
        public string Title { get; set; }
        public int FeedType { get; set; }
        public string XmlUrl { get; set; }
        public string HtmlUrl { get; set; }
    }

    public class Post
    {
        public int Id { get; set; }
        public int FeedId { get; set; }
        public string Title { get; set; }
        public string Link { get; set; }
        public DateTime PubDate { get; set; }
        public string Author { get; set; }
        public string Description { get; set; }
        public string Content { get; set; }
        public int CommentsCount { get; set; }
        public bool Stared { get; set; }
        public bool IsRead { get; set; }
    }
}
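Dynamic schema, described above, effectively means a table’s column set grows to the union of the property names of every object you insert. A toy illustration of that idea in JavaScript (conceptual only; WAMS does this server side, not in your script):

```javascript
// Conceptual model of dynamic schema: each inserted object can contribute
// new columns, so the final column set is the union of all property names.
function columnsAfterInserts(rows) {
    var columns = {};
    rows.forEach(function (row) {
        Object.keys(row).forEach(function (key) { columns[key] = true; });
    });
    return Object.keys(columns).sort();
}

console.log(columnsAfterInserts([
    { id: 1, Title: "First" },
    { id: 2, Title: "Second", XmlUrl: "http://a.example.com/rss" }
]));
// -> [ 'Title', 'XmlUrl', 'id' ]
```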

In the next part of this series I’m going to show you how to start the Google Reader client application that will consume your newly created service. For this post series I’m going to use a Windows 8 application as my client, but building a Windows Phone 8 client app is equally easy.


TechEdEurope2013

Great news arrived this week: I’ve been selected to participate as a product subject matter expert for Windows Azure at TechEd Europe, which takes place in Madrid this year from the 25th to the 28th of June. So I’ll be available if you want to discuss Windows Azure, web and Windows 8 app development, or anything else you’d like to know about Microsoft technologies in general.

Drop me a line if you’re planning to visit TechEd as well, so we can get together and have a beer.