
The Redesigned Visual Studio Developer Center

If you’re a frequent visitor to the Visual Studio Developer Center on MSDN, you’ll notice that we just switched things around a bit. Okay, quite a lot, actually!

We’ve fundamentally changed the information architecture of the site, implemented a regular page pattern for non-Library pages, and significantly reduced the number of non-Library pages.

Information Architecture Changes

Over the past six months, my team and I have been looking closely at site metrics around page views, clicks on pages, survey and other site feedback, and other data available to us around how developers use the site. What emerged from that noise of data were some clear signals about how the site was being used and we’ve tried our best to respond with updated site navigation and content organization.

The most obvious change is in the site navigation:

The Old Visual Studio Developer Center Header Navigation

The New Visual Studio Developer Center Header Navigation

The Old Visual Studio Developer Center Footer Navigation

The New Visual Studio Developer Center Footer Navigation

The biggest change is the move from eight top-level items to six.

What went away were top-level links to Support and Downloads.

The Support link was rarely clicked and the page itself did not receive that much traffic, so that was an easy choice. On the other hand, the Downloads navigation link was consistently the most-clicked link in the header navigation and surfaced as a top task on the site as a whole.

So why remove Downloads from the site navigation? It seems counter-intuitive. In the old model we had a dedicated downloads page that aggregated links to most of the popular or key downloads, spanning both Visual Studio and Team Foundation Server. Finding the download you were looking for meant skimming a few dozen links, and if you didn’t find it, you were pretty much stuck performing a site search or going back to your favorite Internet search engine.

In the new model, downloads are now part of the page-level information architecture as a regular link block, appear on every page, are scoped to the topic of the page they appear on, and usually provide an “all downloads” link that takes you to the Microsoft Download Center. I’m very curious what you think of this change, since it’s a big one.

Other visible changes of note are the renaming of Library to Documentation, the renaming of Learn to Languages, and the addition of sub-navigation links.

Library, as a term, has specific connotations for developers in general and Microsoft developers in particular. To avoid potential confusion for non-Microsoft developers, and in the interest of clarity, we changed it to Documentation, which more accurately reflects what you can expect to find behind that link.

When we looked at the data around Learn, it was clear that developers were looking for language-specific learning resources more than anything else. In fact, wherever we had a language-learning link, it was often the most-clicked link on the page. By elevating Languages to the top-level navigation, we’re hoping that when you come into the site via search, which many of you do, you’re only a click away from those language-learning resources.

The reasons behind the addition of sub-navigation items were twofold: to more clearly represent the lower-level structure of the reorganized site in the header, and to provide a sitemap-like experience in the footer. Hopefully this will help with the perennial “Where the heck is ‘foo’?” question.

Under the hood and not so obviously, the navigation now reflects the underlying content information architecture that pivots around products, samples, languages, extensions, documentation and community. Each one of these “buckets” now only contains content aligned with it, whereas before we had all sorts of stuff spread across the entire site.

Regular Page Pattern

I’m a strong believer that regularized page patterns assist repeat site visitors and that placing things in standard spots reduces the cognitive burden of analyzing a page for the task you are trying to complete. Prior to today, we had pages that more or less followed a handful of page patterns, and several that were freeform designs. If you had picked five pages at random from the site and examined their functional layout, chances are none of them would have matched.

As much as possible (there are always exceptions), we have a single page pattern across the site:

The New Page Pattern

Wherever we could, we contextualized the links in the right-hand column to surface the most-requested downloads and point you to the places that make the most sense when coming from that page.

I’m hoping that if you’re a frequent visitor to the site, you’ll be able to find what you’re looking for more quickly, because things will now always be in the same place on the page.

Page Reduction

How many web pages does a website need? It depends.

In the case of Visual Studio, we had around 323, give or take a handful. The traffic graph was the classic power-law curve, with a long tail that was very flat when you looked out to the end.

When I see something like this across hundreds of pages, I naturally ask, “What are those pages that aren’t getting much traffic and do we need them?” As we clicked through each page of the site, what we discovered was that most of them were very out of date or had been shimmed into the site to solve some short-term need and then forgotten. Others were important to internal, company stakeholders but uninteresting to customers.
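As a rough illustration of that triage (a hypothetical sketch, not the actual tooling or data we used; the CSV column names are invented), a quick pass over an analytics export can flag the pages sitting in the flat part of the tail:

```python
# Hypothetical sketch: flag long-tail pages from a page-view export.
# Assumes a CSV with "url" and "views" columns (invented for illustration).
import csv

def long_tail_candidates(path, tail_share=0.05):
    """Return the pages that together account for only the last
    `tail_share` of total views when sorted by traffic."""
    with open(path, newline="") as f:
        rows = [(r["url"], int(r["views"])) for r in csv.DictReader(f)]
    rows.sort(key=lambda r: r[1], reverse=True)  # most-trafficked first
    total = sum(views for _, views in rows)
    running, candidates = 0, []
    for url, views in rows:
        running += views
        # Everything past the 95% cumulative-traffic mark is tail material.
        if running > total * (1 - tail_share):
            candidates.append((url, views))
    return candidates

for url, views in long_tail_candidates("pageviews.csv"):
    print(f"{views:>8}  {url}")
```

The “do we need them?” call still requires a human click-through, of course; a script like this only narrows the list.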

After careful deliberation, we’ve removed and redirected hundreds of pages. The site is now about 80 pages, and we’ll keep working to drive that down to under 50. The core site, which you access directly via the navigation and sub-navigation elements, is 11 pages.

By doing this, we’re hoping that your search experience just became a whole lot better. Fewer pages with completely rewritten, focused content limited to a single topic should produce better search-engine results than many diffuse pages that span multiple topics.

Please let me know what you think about this new experience in the comments, as I’d love to have more input as we consider the future evolution of the site.

My New Job

Visual Studio 2012 shipped last month, so the rhythm of the business shifted to reorganizing for the next product cycle. As part of the reshuffling, my job scope expanded from overseeing the Visual Studio Developer Center to covering MSDN proper and the Team Foundation Service websites. My charter covers content presentation, information architecture, and experience integration. That’s a fancy way of saying, “Help build a website that is more useful and pleasant to visit.”

MSDN, according to Alexa, receives about 11% of all web traffic directed to microsoft.com, so I’m really looking forward to helping shift the experience for the millions of customers that visit the site. If you’re one of those customers, feel free to let me know what you do and don’t like about MSDN, and I’ll incorporate that into the evolving plans.

My team has also grown and I’m very happy to be working with some of the same people that I worked with a few years back when I was Lead Site Manager for TechNet*.

I already have some ideas about some things I’d like to see change and I’ll be highlighting them here going forward; don’t be shy about telling me what does and doesn’t work. 😉

(*Having helped launch Microsoft Answers in 2008, I’ve now officially worked across the Microsoft technical website audience triad of Developer, IT Pro, and Consumer. It’s amazing and humbling when I start to think of the millions of people that have interacted with my work. Very few places in the world offer that kind of scale, and I feel fortunate for the opportunity.)

Visual Studio 11 Beta Launch

I was up at 04:45 this morning so I could be at work by 06:00 to help out with the Visual Studio 11 Beta launch. Check out Jason Zander’s post for a great rundown of what’s in the beta.

Those that know me know that I am not a morning person, so I’m happy I made it in to work in one piece. Parking was very easy, too.

Pick your spot...

My role was to oversee updates to the Visual Studio Developer Center in English, French, German, Japanese and Simplified Chinese. We had an intentionally small footprint for the Beta on the Developer Center, so this was a fairly small release as far as these things go. As usual, I relied on a large team of folks to help make this stuff happen, and they did a great job of making it all come together.

Now I just need to make sure I don’t fall asleep before my last meeting of the day!

Data Wallowing

It’s been snowing in the Seattle area and I now live on a very steep hill, so today I worked from home and have been wallowing in website analytics between a couple of sledding breaks.

My data wallows look at everything the analytics reports can serve me: visitors, page views, countries, browser versions, page paths, etc. The more esoteric the data, the more I like it, actually. I tend to find the signals in the extremes: the most popular and least popular stuff. They tell you what to focus on and what to chuck overboard.

I look at the last month, quarter, half-year, and year to get a feel for the trends over time and see how major site updates impacted traffic. I look at the top stats for each bucket and also look deep in the long tail to see what’s hiding. I sanity-check the data against my expectations, like looking at the percentage of non-U.S. visitors to the U.S. site (it’s always higher than I expect) and referring pages (the top entry is really bookmarks instead of search???).

I look at clickmaps of the most and least trafficked pages to get a sense of how the page layout may be influencing clickthroughs.

Then, if I have access to it (at Microsoft I do), I look at the data from the referring sites themselves to see where the site I’m analyzing ranks among their outbound links. I look for customer satisfaction data, customer feedback, planning and marketing data, as well as industry trends for the segment the site’s in.

I search social media and look for positive and negative things about the site in question. I also see what they’re saying about the competition.

Then I spend time thinking about instrumentation gaps and how I can triangulate across or re-query the data sets I do have access to in order to guesstimate the gaps. Examples of gaps I’ve run into in the past include not instrumenting by content type or site section.

When I have all this data loaded into my head and my spreadsheets, I can finally begin analysis by creating an empirical top-task list based on what the data says and comparing it to the expected or desired top-task list. Further analysis is a topic for another day.
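To make “empirical top-task list” concrete, here’s a toy sketch; the task names, signal names, weights, and numbers below are all invented for illustration, not the real inputs. The idea is simply to normalize each signal across tasks, weight them, and rank:

```python
# Toy sketch of an empirical top-task ranking.
# All signals, weights, and numbers are illustrative assumptions.
def rank_tasks(signals, weights):
    """signals: {task: {signal_name: raw_count}}.
    Normalize each signal to [0, 1] across tasks, then take a weighted sum."""
    maxima = {}
    for counts in signals.values():
        for name, value in counts.items():
            maxima[name] = max(maxima.get(name, 0), value)
    scores = {
        task: sum(weights.get(name, 0) * value / maxima[name]
                  for name, value in counts.items() if maxima[name])
        for task, counts in signals.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

signals = {
    "download product": {"page_views": 90_000, "nav_clicks": 40_000, "searches": 12_000},
    "learn a language": {"page_views": 55_000, "nav_clicks": 18_000, "searches": 30_000},
    "contact support":  {"page_views": 4_000,  "nav_clicks": 900,    "searches": 1_500},
}
weights = {"page_views": 0.4, "nav_clicks": 0.4, "searches": 0.2}

for task, score in rank_tasks(signals, weights):
    print(f"{score:.2f}  {task}")
```

The interesting part is never the arithmetic; it’s arguing about which signals deserve weight, and then comparing the ranked output against the tasks you expected (or wanted) to be on top.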

Information Architecture Planning

I’m just starting a new website project at work, and one of the first things I’m attacking is the information architecture. I’ve learned the hard way that having a solid site architecture can save boatloads of redesign pain later on.

My method is to plan it out up front, and it goes roughly like this:

  • Learn everything I can about the topic(s) to be presented – past history, present situation, and future plans
  • Discover who all the stakeholders are
  • Perform an existing site content type and page audit
  • Build a site map of the current site
  • Wallow deeply in site metrics for top pages, visitor trends, referrers, search terms, and page flows
  • Wallow deeply in customer data around segmentation, intent drivers, and key tasks
  • Look for stuff that can be dumped overboard
  • Figure out what will need to be added in the future
  • Whiteboard out all the elements (content, information flows, external process connections, customer segments, etc.)
  • Stare at whiteboard for hours, then erase and draw, erase and draw until a model and page pattern(s) appear
  • Wireframe a few pages with the designers to get a feel if the model works across architectural segments
  • Go back to whiteboard and fix the broken stuff
  • Wireframe again
  • Look for more stuff that can be dumped overboard
  • Build high-fidelity comps, preferably on the deployment platform
  • Lock it for usability/review
  • Tweak after usability/review (if needed)
  • Hand off to production when it’s complete

Easy, eh? 😉

 

New Year, New Job and Shingling

After two and a half years of a fascinating, challenging, and wild ride in the Windows Phone division, I am moving this coming week to the Developer Division, where I will be a Senior Program Manager in Visual Studio overseeing a project that spans the Visual Studio marketing and developer websites.

This will be the fourth large website project I’ve done for Microsoft. The first three were: combining the Windows XP, Windows Vista, and Windows 7 TechCenters; launching Microsoft Answers, Microsoft’s first forum-based consumer customer support site; and launching App Hub, the Windows Phone developer website, which combined the XNA Creator’s Club and the previous Windows Phone developer site.

The common threads across all of the sites I drove were refactored information architectures, large-volume content presentation, and cross-divisional working teams. I’ve learned quite a bit about these areas over the past six years at Microsoft, and I’ll certainly bring that to bear in my new role.

On a different note, I’ve spent the past few months analyzing quite a bit of data around the API Reference portions of the MSDN Library and evangelizing my results to documentation teams across Microsoft. I’m cautiously optimistic that a couple of teams have taken the data to heart, as I know that some work has begun to address some of the larger pain points. When this work by many, many people eventually comes to fruition, it should dramatically increase MSDN Library search-engine relevance, hopefully making problems like this and this much less severe, and make the treasure hunt for API information less frustrating.

In a nutshell, structural artifacts of the documentation process create web pages that are very similar in content. Search engines use a process called shingling to de-duplicate and winnow results, and when they look across large, structured documentation sets, specific pages may never surface in search results because they look too similar to other pages. (Examples A, B, and C.) Mark Manasse of Microsoft Research was kind enough to give me some of his time in December to explain this in more detail and to suggest some great ideas for understanding the scope of the problem and ways to solve it.
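For the curious, here’s a minimal sketch of the textbook w-shingling idea (not Microsoft’s or any search engine’s actual implementation, and the sample “pages” are invented): split each page into overlapping w-word runs and compare the resulting sets with Jaccard similarity.

```python
# Minimal w-shingling sketch: near-duplicate detection via Jaccard similarity.
def shingles(text, w=3):
    """Return the set of all w-word runs (shingles) in `text`."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a, b):
    """Similarity of two shingle sets: |A ∩ B| / |A ∪ B|."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Two auto-generated reference pages that differ only in one word look
# very similar to a shingle-based de-duplicator:
page1 = "Gets or sets the width of the control. This property is read write."
page2 = "Gets or sets the height of the control. This property is read write."
print(jaccard(shingles(page1), shingles(page2)))  # substantial overlap
```

With auto-generated API reference pages, the shared template text dominates the shingle sets, so even pages documenting entirely different members can score as near-duplicates and get collapsed.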

I wish I could say more now, but changing page patterns for millions of web pages takes time. I’ll keep you posted. 🙂