
# UruIT Blog

A nearshore team from Uruguay, South America (GMT-3), working with Microsoft technologies.

### News


I have been using PowerPivot for quite some time now, and it has helped me create some nice dynamic reports without the need to commit hours of specialized BI development.

Even though working with this tool is fairly intuitive, there are some DAX expressions that require extra investigation (DAX stands for Data Analysis Expressions). The EARLIER() function was such an expression.

The need presented itself while I was working on my latest report, and it seemed pretty straightforward. Nevertheless, I had to try out several things without luck, until I found EARLIER().

Here's what I wanted to do:

I had two tables: HHRR Costs, with all the hours put into each project, down to the month, and Project Income, which had each project's income down to the month as well.

The relationship was created over a calculated column on both tables by concatenating year, month, and project name. This is a common procedure to define 1-to-1 or 1-to-many relationships on tables that don’t have unique values in a single column. So I created a calculated column [Year_Month_Project] = [Year]&[Month]&[Project] on each table and connected them.

Here comes the tricky part...

Now I needed to perform some calculations in the HHRR Costs table, and for that I needed to know the total income of each project, referenced in each row of the HHRR Costs table :). For example, something like this:

($10,000 was the total income of the project over its 3-month duration.)

First thing I tried was:

[Total Project Income] = RELATED('Project Income'[Project Income]).

But this gave me the project income for each month, because that's how the relationship was set (at the month level), so I would get something like this:

Then I went on trying several combinations with the SUM, FILTER and SUMMARIZE functions without any luck. I needed to tell PowerPivot that for each of the rows in this table (HHRR Costs), go to the Project Income table, and add up all the values in the Income column but only for the related project (current row in HHRR Costs).

So I tried something like this:

[Total Project Income] = SUMX(FILTER('Project Income', 'Project Income'[Project] = [Project]), 'Project Income'[Income])

Close, but not quite, because the context of [Project] was not the current project... So instead of getting the total income of the current project, I got the sum across all projects in every row.

The solution came to me when I discovered the function EARLIER(), and here is how you should do it:

[Total Project Income] = SUMX(FILTER('Project Income', 'Project Income'[Project] = EARLIER([Project])), 'Project Income'[Income])

Basically, EARLIER locks in the current row's [Project] value through the SUMX iteration. Thus it performs exactly what I needed:

For each row in HHRR Costs:

1. Filter the Project Income table by the current row's project name
2. Sum the [Income] column of the previously filtered rows
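The two steps above can be sketched in plain JavaScript over hypothetical in-memory tables (this is just an illustration of the row-context idea, not DAX):

```javascript
// Hypothetical in-memory versions of the two tables.
var hhrrCosts = [
  { project: "A", month: "2013-01", hours: 100 },
  { project: "A", month: "2013-02", hours: 80 },
  { project: "A", month: "2013-03", hours: 60 },
  { project: "B", month: "2013-01", hours: 50 }
];
var projectIncome = [
  { project: "A", month: "2013-01", income: 3000 },
  { project: "A", month: "2013-02", income: 3000 },
  { project: "A", month: "2013-03", income: 4000 },
  { project: "B", month: "2013-01", income: 3000 }
];

// For each row in HHRR Costs (the "earlier" row context),
// filter Project Income by that row's project and sum the income.
hhrrCosts.forEach(function (costRow) {
  costRow.totalProjectIncome = projectIncome
    .filter(function (incomeRow) { return incomeRow.project === costRow.project; })
    .reduce(function (sum, incomeRow) { return sum + incomeRow.income; }, 0);
});
```

Every HHRR Costs row of project A ends up with the same $10,000 total, which is exactly what EARLIER() lets FILTER see: the outer row's project, not the one being iterated.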


Thanks

Posted: Jan 22 2013, 11:40 AM by uruit

SharePoint offers the possibility to connect to the content of a document library directly from Outlook, edit the documents offline, and then sync when the connection is restored. This is very useful if we are working from home and want to access a shared document (e.g., VPN connection settings) or continue working directly on a file.

Steps to configure the connection:

1. Browse to the SharePoint document library you want to connect to and click "Connect to Outlook":


2. Click Allow to confirm:

3. In Outlook you will see the documents as Outlook e-mail items, with the ability to preview them. When a document is updated, Outlook will notify you that you have unread items. If you want to edit a file, the corresponding Office tool (Word, Excel, PowerPoint) will ask if you want to update the server after saving a change; it is really straightforward.


4. Finally, I recommend adding the IP address of your SharePoint server to the trusted sites, in order to prevent Outlook from asking for your Windows credentials every time you open it:


Outlook is a great tool that lets you work in a really integrated way; don't miss this amazing feature.

This feature is also available in SharePoint Online :)

Post by: Marcelo Martinez

Leaders in Nearshore Outsourcing from South America

We had a truly exciting week at UruIT due to the release of CRMGamified (beta) by UruIT Dynamix, a branch of UruIT. We are excited not only because of this new line of business, but because the product itself is of great importance to UruIT as a whole: it solves one of the most common challenges in every CRM implementation, user adoption and compliance.

At UruIT we implemented CRM some time ago, initially on-premises and now in the cloud, and it became a very important part of many of the processes we follow in the company. Besides all the functionality available to our sales and marketing departments, our operations, administration, and finance departments depend heavily on this tool. In it we keep track of every sale that is closed, with all the details needed to effectively plan and size projects, invoice clients, and build cash flow reports. As you can see, the information stored in the tool needs to be as accurate and complete as possible.

How can we trust the information presented to us by our systems? I am not talking only about Dynamics CRM here, but about systems in general. How do you know the information delivered to managers is accurate and complete? There is no easy solution to this, and it will depend on each particular scenario, but in our case (and for many clients as well) we found that applying gamification concepts to our tools improves the accuracy and completeness of our data.

Instead of going on a witch hunt, asking people to fill in every field of every screen, we go the other way around by rewarding those who do. CRMGamified is a complete gamification solution for MS Dynamics CRM/xCRM. The tool is an add-on to Dynamics CRM, and it is highly customizable, letting managers control how the reward system behaves in accordance with each particular business scenario.

So now people use the system and participate (if configured) in a sort of healthy competition to keep the most up-to-date information entered and, in the case of sales, to have the best performance among their peers. This has increased the completeness of our data and kept the teams motivated throughout their workday. It has also helped our managers monitor activities through a set of cool reports and shape user behavior. Bear in mind that a well-thought-out compensation plan goes a long way in tandem with this solution.

Here are some screenshots of the tool, but I recommend visiting the website to see all the available features and try the online demo, or downloading the beta to fully test it in your environment.

Posted by: Iang Yim

UruIT (www.uruit.com)

#### Knockout: introduction

Knockout is a Model-View-ViewModel framework for JavaScript. In this post I will explain the MVVM pattern and the concept of data-binding, and give some examples of usage. At the end of the article you will find a sample file with all the code.

#### Data-oriented vs Control-oriented programming

Whenever I create a new web project, the same question arises: do I split the files to keep the HTML separated from the JavaScript?

As Dan Wahlin explains, if we separate the code then, using the traditional control-oriented approach, it will look similar to this:

```
$("#txtName").val("…");
$("#txtAge").val(28);
```

The files are coupled: the JavaScript file needs to reference the HTML controls by their IDs. It is not a bad solution; it works.

But if I add, remove or rename any control, then I have to remember to change the JS file as well.

An alternative is a data-oriented approach. The main difference is that control-oriented code needs to know how to reach each control to move data in and out, while data-oriented code just declares what data each control is linked to.

This can be achieved using a data-binding framework for JavaScript, such as Knockout, which lets you link a control to an object to get or set its value.

#### MVVM

Model-View-ViewModel is an architectural pattern, used in Knockout, which “… facilitates a clear separation of the development of the graphical user interface (either as markup language or GUI code) from the development of the business logic or back end logic known as the model.” (Source: Wikipedia.org)

The Model represents the domain data. It has no behavior, except maybe for some data validation. All the business logic should be in the View Model. Finally, the View is the layer where the data is displayed, and it only interacts, as you may expect, with the View Model.

In a traditional scenario you will consume the data from a service (which interacts with the model) and show it in a web page (the view). You define the View Model in your JavaScript code. See the example in the following section.

#### Knockout data binding

To get started, let's create a simple ViewModel object with the company information.

```
function CompanyViewModel() {
    this.name = "UruIT";
    this.country = "Uruguay";
}
```

And our view will be HTML: two textboxes, and one label to display the name.

```
<label>Company name:</label><input data-bind="value: name" />
<label>Country:</label><input data-bind="value: country" />
<label data-bind="text: name"></label>
```

Look at the data-bind attributes in the HTML elements. They specify that the input values will be the name and country fields of our view-model object. The last label has its text bound to the name as well.

But in order to get those values, Knockout has to be activated, with an instance of our view-model object:

`ko.applyBindings(new CompanyViewModel());`

In this example, if you change the input values the underlying ViewModel won’t change. This is because we haven’t configured the dependency tracking yet. So, we have to change the way the ViewModel’s fields are defined, like this:

```
function CompanyViewModel() {
    this.name = ko.observable("UruIT");
    this.country = ko.observable("Uruguay");
}
```

Observable properties are capable of notifying the UI that their values have changed. If you run the example now and change the company name, you will see that the label is automatically updated.

#### Computed values

Knockout supports the definition of fields that depend on others. For example, if we want a field with both the company name and the country, we would add a new property to the view-model as shown here:

```
this.companyInfo = ko.computed(function () {
    return this.name() + " / " + this.country();
}, this);
```

We can bind this field to any control, as we did with the other properties and elements. Note that since name and country are observables, they must be called as functions.

#### Other bindings

We have seen how to bind the value and text properties of some HTML controls. But Knockout is not limited to that.

You could bind the click event of a button to a function defined in the view-model, bind the href property of an `<a>` element or the src of an image, and more. Here are some of the possible bindings:

##### Visible

If bound to a false value, it sets the display style to none. Otherwise the display style is removed, making the element visible.

In the following example, the element will be visible only if the result of the function is greater than zero:

`<div data-bind="visible: testFunction() > 0" />`
##### Css

With this binding, we can add or remove a css class of an element, depending on a condition.

In this example, myCssClass is assigned to the div only if the result of testFunction is less than zero.

`<div data-bind="css:{ myCssClass: testFunction() < 0 }" />`
##### Attr

This binding allows you to set any element property.

For instance, we can set the href of an anchor tag, or the src of an image:

```<a data-bind="attr: { href: url }">Link</a>

<img data-bind="attr: { src:imagePath }"/>```

#### Control flow: foreach

When you have to display several items using the same template, you can use the foreach binding. This way, all the items of an array will be rendered.

In this example, there is an array in our ViewModel object whose elements we want to display in an unordered HTML list.

```
function TechnologiesViewModel() {
    this.languages = [
        { name: "C#" },
        { name: "C++" },
        { name: "Java" }
    ];
}
```

```
<ul data-bind="foreach: languages">
    <li data-bind="text: name"></li>
</ul>
```

The languages array contains three objects which have only one property: name.

A new list item will be created for each element in the languages array. In case you have a string array, you will have to bind each element to the $data variable, which holds the current string.

If you are interested in learning more about Knockout, I suggest you check the documentation.

UruIT (www.uruit.com)

Before you run the installer make sure you’re:

· Installing the router on a Windows 7 32-bit (you’d need the 32-bit E-mail Router version in this case) or 64-bit system, or Windows Server 2008 64-bit system

· Logged in as a Domain User and have Administrator privileges

INSTALLATION

1) Run SetupEmailRouter.exe. You'll be prompted to choose between getting the latest updates for Dynamics CRM or installing it as-is. Select "Get Updates" and click "Next".

3) You're now asked to select the components you'd like to install: the Dynamics CRM Router Service and the Rule Deployment Wizard (the latter lets you deploy mailbox forwarding rules, so it's recommended to select it as well).

4) Specify the destination folder where the router will be installed.

5) The system will check that everything is correct and ready for installation to continue. (Note that if you’ve selected the Rule Deployment Wizard component in step 3 you need to have the Microsoft Exchange Server Messaging API installed, otherwise the system won’t let you go forward)

6) A review is displayed with the options you selected. Click “Install” to finish the installation and proceed to configure it.

CONFIGURATION

1) Start the E-mail Router Configuration Manager. The first thing you need to do is create a new Incoming profile on the Configuration Profiles tab.

Profile Name: Anything you like and can identify later.

Direction: Select Incoming.

Email Server Type: The Exchange Server type your organization is currently using.

Protocol and Authentication Type: The protocol and authentication type your Exchange Server uses. In this case (Exchange Online) there are no further options to choose from.

Location: The Exchange Online web server URL of your organization.

Access credentials: Select "Other Specified" and enter the credentials of an active Exchange Server account.

(Note: unless you intend to use a port other than the default one, the "Advanced" tab doesn't need any changes or configuration.)

2) Now follow the same procedure to create an Outgoing Profile with some minor differences:

Direction: Select Outgoing.

Access Credentials: Select Administrator for user type.

Access Type: If you select "Delegate Access", e-mails will be sent displaying "Sent on behalf of (CRM User)" as the sender. If you choose "Send as permission", e-mails will be presented as if sent by the CRM user directly.

(Note: Remember to always publish your changes before going any further)

3) In “Deployments” tab create a new deployment.

Deployment: Microsoft Dynamics CRM Online.

Dynamics CRM Server: Replace <Organization Name> with your organization’s unique name (can be found in Settings\Customizations\Developer Resources).

Access Credentials: Select “Other Specified” and enter the credentials of a valid CRM user.

Default Configuration Profiles: Select the Incoming and Outgoing profiles you created before.

4) In the last tab, "Users, Queues and Mailboxes", select the appropriate deployment (in case you've created more than one) and click "Load Data". It will display the users enabled to use the E-mail Router (you must first enable this option for each user in your organization, or they won't have access to it).

5) Finally, click "Test Access" to be sure everything works correctly. You should get a "Succeeded" confirmation.

Nico V.

UruIT Dynamix www.uruitdynamix.com

In this post I will show you how to develop a plug-in to assign records related to an entity over a 1:N relationship.

Let's set some background. Imagine we have a child entity "Child" related to a parent entity "Parent" (N:1). When we assign "Parent" to another user, we also want all the related records of type "Child" to be assigned to that user. You can do so by configuring the relationship behavior between "Parent" and "Child" to "Parental".

OK, so far, so good, but what if "Child" also has a "Parental" relationship with another entity? We just can't do it: CRM allows only one custom parental relationship per child entity (out-of-the-box Dynamics CRM ships with several "Parental" relationships on the same entity, but with custom ones it's not possible).

So, the only way to carry out this task is through a plug-in, registered for the “Assign” message on the parent entity.

This is the code for it:

```
public void Execute(IServiceProvider serviceProvider)
{
    // Grab the plug-in execution context
    IPluginExecutionContext Context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

    // Grab the service factory
    IOrganizationServiceFactory factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));

    // Create the organization service
    IOrganizationService Service = factory.CreateOrganizationService(Context.UserId);

    if (Context.InputParameters.Contains("Assignee"))    // Parameter of the Assign request
    {
        EntityReference assignee = (EntityReference)Context.InputParameters["Assignee"];

        // Data context from the crmsvcutil.exe generated class with option /serviceContextName:XRMDataService
        using (XRMDataService DataContext = new XRMDataService(Service))
        {
            Guid ParentEntityId = Context.PrimaryEntityId;
            new_parent Parent = DataContext.new_parentSet
                .Where(c => c.new_parentId == ParentEntityId)
                .FirstOrDefault();

            if (Parent != null)
            {
                try
                {
                    // Grab all the child records related over the 1:N relationship
                    List<new_child> AssociatedChilds = DataContext.new_childSet
                        .Where(a => a.new_parentId.Id == ParentEntityId)
                        .ToList();

                    foreach (new_child Child in AssociatedChilds)
                    {
                        AssignRequest assign = new AssignRequest
                        {
                            // User or team assigned to the child records
                            Assignee = assignee,
                            // Target child record
                            Target = Child.ToEntityReference()
                        };
                        // Execute the request
                        Service.Execute(assign);
                    }
                }
                catch (Exception ex)
                {
                    throw new InvalidPluginExecutionException("An error occurred in the plug-in: " + ex.ToString());
                }
            }
        } // End of using DataContext
    }
}
```

That's all for now. Hope it helps.

Regards.

Nicolás Brandl [@nicolasbrandl]

Dynamics CRM Specialist

UruIT – Dynamix (http://www.uruitdynamix.com/)

Some days ago, I faced an issue with the CRM audit feature. A customer was unable to close an active case since there was an open activity related to it. I told him to close all open activities before closing the case, because that's the way it works. However, he was unable to close the activity because CRM had experienced an error.

The activity that my customer was trying to close was a draft e-mail (in fact, he wanted to delete the e-mail, but he was unable to). When I took a look at it, I found that some recipients (highlighted in red) were not resolved in CRM. The exception was related to the audit feature I had enabled a couple of days before the user hit this error. The plug-in that registers the changes on the entities tried to establish a relationship between the recipients of the e-mail (the To, CC, and BCC fields) and some existing contact or account in CRM, which in fact did not exist. Just by looking at the server log, you can confirm this:

```
The Web Service plug-in failed in OrganizationId: 0d469757-a137-46d1-8f38-12a6d65f79a5; SdkMessageProcessingStepId: b92673ed-dc92-442b-a6c6-82f2fce14585; EntityName: email; Stage: 25; MessageName: Delete; AssemblyName: Microsoft.Crm.AuditMonikerMessagesPlugin, Microsoft.Crm.Audit, Version=5.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35; ClassName: Microsoft.Crm.AuditMonikerMessagesPlugin; Exception: Unhandled Exception: System.Collections.Generic.KeyNotFoundException: The given key was not present in the dictionary.
    at System.Collections.Generic.Dictionary`2.get_Item(TKey key)
    at Microsoft.Crm.AuditHelper.GetXrmValue(Object attribute)
    at Microsoft.Crm.AuditMonikerMessagesPlugin.Execute(IServiceProvider serviceProvider)
    at Microsoft.Crm.Extensibility.V5PluginProxyStep.ExecuteInternal(PipelineExecutionContext context)
    at Microsoft.Crm.Extensibility.VersionedPluginProxyStepBase.Execute(PipelineExecutionContext context)
```

My workaround for this problem was to disable the audit feature for the To, CC, and BCC fields of the e-mail entity.

I hope this helps.

Nicolás Brandl

When installing SharePoint solutions in production environments, configuration often leads to assorted complications. Writing installation manuals for non-technical customers, adding configuration parameters to the web.config file by hand, or manually configuring third-party components can all be avoided by handling configuration within the solution to be delivered, entirely through the SharePoint API, without requiring any manual steps.

This can be accomplished with a farm-scoped feature that is responsible for changing settings in web.config. This feature should have FeatureActivated and FeatureDeactivating event receivers in order to add and remove configuration values in web.config, all through the SharePoint API.

Then we should create a feature dependency between the configuration feature and the rest of the features in the solution, so that they cannot be activated before the farm-scoped feature, which is the one that provides the configuration (by adding it to web.config) that your solution needs to work.
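Such a dependency is declared in each dependent feature's Feature.xml through an activation dependency on the configuration feature's ID. A minimal sketch, with placeholder GUIDs and titles:

```
<?xml version="1.0" encoding="utf-8"?>
<!-- Feature.xml of a dependent (e.g. web-scoped) feature -->
<Feature xmlns="http://schemas.microsoft.com/sharepoint/"
         Id="00000000-0000-0000-0000-000000000002"
         Title="My Solution Web Feature"
         Scope="Web">
  <ActivationDependencies>
    <!-- Id of the farm-scoped configuration feature -->
    <ActivationDependency FeatureId="00000000-0000-0000-0000-000000000001" />
  </ActivationDependencies>
</Feature>
```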

```
public override void FeatureActivated(SPFeatureReceiverProperties properties)
{
    SPSecurity.RunWithElevatedPrivileges(() =>
    {
        string owner = this.GetType().FullName;

        SPWebConfigModification chartImageAppSetting = new SPWebConfigModification();
        // XPath to the parent element in which I want to add the child node
        chartImageAppSetting.Path = "configuration/appSettings";
        // XPath that identifies the element I am adding, so it can be correctly updated or deleted afterwards
        chartImageAppSetting.Name = "add[@key='ChartImageHandler']";
        chartImageAppSetting.Sequence = 0;
        chartImageAppSetting.Owner = owner;
        chartImageAppSetting.Type = SPWebConfigModification.SPWebConfigModificationType.EnsureChildNode;
        // XML node to add to the web.config
        chartImageAppSetting.Value = "<add key='ChartImageHandler' value='storage=memory;deleteAfterServicing=true;' />";

        SPWebConfigModification chartImageSystemWebHttpHandler = new SPWebConfigModification();
        chartImageSystemWebHttpHandler.Path = "configuration/system.web/httpHandlers";
        chartImageSystemWebHttpHandler.Name = "add[@path='ChartImg.axd']";
        chartImageSystemWebHttpHandler.Sequence = 0;
        chartImageSystemWebHttpHandler.Owner = owner;
        chartImageSystemWebHttpHandler.Type = SPWebConfigModification.SPWebConfigModificationType.EnsureChildNode;
        chartImageSystemWebHttpHandler.Value =
            "<add path='ChartImg.axd' verb='GET,HEAD,POST' " +
            "type='System.Web.UI.DataVisualization.Charting.ChartHttpHandler, " +
            "System.Web.DataVisualization, " +
            "Version=3.5.0.0, " +
            "Culture=neutral, " +
            "PublicKeyToken=31bf3856ad364e35' " +
            "validate='false' />";

        // Add the SPWebConfigModifications to the ContentService
        SPWebService service = SPWebService.ContentService;
        service.WebConfigModifications.Add(chartImageAppSetting);
        service.WebConfigModifications.Add(chartImageSystemWebHttpHandler);
        service.Update();
        // All the modifications are applied to the web.configs in the farm
        service.Farm.Services.GetValue<SPWebService>().ApplyWebConfigModifications();
    });
}
```
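The matching FeatureDeactivating receiver can then remove every modification the feature owns; a minimal sketch, assuming the same owner-string convention as above:

```
public override void FeatureDeactivating(SPFeatureReceiverProperties properties)
{
    SPSecurity.RunWithElevatedPrivileges(() =>
    {
        string owner = this.GetType().FullName;
        SPWebService service = SPWebService.ContentService;

        // Remove only the modifications this feature added, matched by Owner
        for (int i = service.WebConfigModifications.Count - 1; i >= 0; i--)
        {
            if (service.WebConfigModifications[i].Owner == owner)
            {
                service.WebConfigModifications.Remove(service.WebConfigModifications[i]);
            }
        }

        service.Update();
        service.Farm.Services.GetValue<SPWebService>().ApplyWebConfigModifications();
    });
}
```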

Federico Rodriguez

Posted: Oct 03 2011, 09:23 AM by uruit
Filed under: ,

Hi! In this post we will see how to consume OData from Android. OData is an open web protocol for querying and updating data that has become very popular lately due to its simplicity and the availability of tools and libraries.

1. To consume OData from an Android application we will use a library called "Restlet". It can be downloaded from here: http://www.restlet.org.
Note: we will use both the Java SE (JSE) Restlet libraries and the Android Restlet libraries, so you will have to download both editions.

2. Once you have downloaded both editions, the first step is to create a Java program to auto-generate the entities and the proxy service that will be used to consume OData from our Android application. The JSE Restlet libraries can automatically generate the entity classes that access the OData service, but that functionality isn't available in the Android Restlet libraries, so we first need a plain Java project that uses the JSE Restlet libraries to generate the entities.

3. So, create a new Java project within Eclipse and add the external JSE jars.

This is done by right-clicking on the project -> Build Path -> Configure Build Path. Click on Add External JARs… and go to the folder where Restlet was installed (by default C:\Program Files (x86)\Restlet Framework\Edition Java SE\2.0.4\lib). Add the following libraries:

• org.restlet.jar
• org.restlet.ext.odata.jar
• org.restlet.ext.freemarker.jar
• org.restlet.ext.atom.jar
• org.restlet.ext.xml.jar
• org.freemarker_2.3\org.freemarker.jar

4. Then create a Main class and add the following code:

`Generator.main(new String[] { "[service url]", "[folder path]" });`
• [service url] – Url of the oData service
• [folder path] – Folder path where the Generator will copy the auto-generated classes

5. Run the project and go to the [folder path], you will see a class in the root of the folder (this is the proxy service) and a folder in the root with all the entities of the service within.

6. After that we have the proxy and the entities ready to be used from our Android App.

7. Import classes to our Android Project: Right click on the src folder, Import->File System and select the [folder path].

8. How to use the auto-generated classes?

a. First of all we need to reference the Android Restlet JARs (by default C:\Program Files (x86)\Restlet Framework\Edition Android\2.0.4\lib). Add these libraries:

• org.restlet.jar
• org.restlet.ext.odata.jar
• org.restlet.ext.atom.jar
• org.restlet.ext.xml.jar
• org.restlet.ext.net.jar

b. After creating an instance of the proxy, you are ready to call any method you want:

`IDataServiceProxy proxy = new DataServiceAtomPubProxyImpl(DataServiceClient.URL);`

9. That’s it! Here is a screen capture of our Android App consuming oData:

Now we will see some examples that show how to work with the proxy we just created.

The following example gets all the café entities and displays some of their properties:

```
TestAssociationOneToOneService service = new TestAssociationOneToOneService();
Query<Cafe> query = service.createCafeQuery("/Cafes");
for (Cafe cafe : query) {
    System.out.println("id: " + cafe.getID());
    System.out.println("name: " + cafe.getName());
}
```

The following example gets a single entity and displays some of its properties:

```
Query<Cafe> query = service.createCafeQuery("/Cafes('1')");
Cafe cafe = query.iterator().next();
System.out.println("id: " + cafe.getID());
System.out.println("name: " + cafe.getName());
```

Add an Entity:

```
Cafe cafe = new Cafe();
cafe.setID("3");
cafe.setZipCode(12345);
cafe.setName("Bar des sports");
cafe.setCity("Paris");
service.addEntity(cafe);
```

Delete an Entity:

```
Query<Cafe> query = service.createCafeQuery("/Cafes('1')");
Cafe cafe = query.iterator().next();
service.deleteEntity(cafe);
```

Get a single Entity and its associated entities:

```
Query<Cafe> query = service.createCafeQuery("/Cafes('1')").expand("Item");
```

# Conclusions

We saw that it is relatively easy to query and update data exposed via OData using the Restlet library. This article, together with Consuming OData from iPhone and Consuming OData from Windows Phone 7, completes our series on how to consume OData from mobile applications. OData is definitely an important tool to consider when architecting multi-platform, service-oriented applications.

I hope you liked the article. Thanks for reading!

Diego Acosta

Posted: Sep 13 2011, 11:48 PM by uruit

# Introduction

There comes a time in the life of every .NET developer when you need Visual Studio to do something that
can only be described as a "class breakpoint": a quick command to set a breakpoint on every access to a class.
Unfortunately, after googling this concept, you'll find out that there's no easy way to accomplish it in Visual Studio.
That's why I created an add-in that adds such a command to the development environment:

When the command is activated, it sets a breakpoint on every function and property of every class in the current document:

Also, note that this add-in is language-agnostic, meaning that it will work for C#, Visual Basic, and even native C++ applications.

In the rest of the article I'll show the basic steps to create a simple addin for Visual Studio 2010.

Visual Studio makes it easy to create an add-in project by providing a template. In the New Project dialog,
select Other Project Types, Extensibility, Visual Studio Add-in.
You'll see that a very simple project is created, with the core logic in a class named Connect. This class manages
the lifecycle of the add-in through methods such as OnConnection and OnDisconnection.
The class field _applicationObject holds a DTE2 object through which we communicate with the environment.

# Handling events

In this particular case we want to add a command to the Debug menu after a solution is loaded, so we need to wait for that event. All the solution events are exposed through the DTE2.Events.SolutionEvents object:

```
_solutionEvents = _applicationObject.Events.SolutionEvents;
_solutionEvents.Opened += new _dispSolutionEvents_OpenedEventHandler(OnSolutionOpened);
_solutionEvents.AfterClosing += new _dispSolutionEvents_AfterClosingEventHandler(OnSolutionClosed);
```

There's a minor caveat here: I'm keeping the reference to the SolutionEvents object in a field of the Connect class. If I didn't, the SolutionEvents object would be collected by the garbage collector, and the events would never be raised.

Once we handle the Opened event, we need to add the command to the user interface:

```
object[] contextGUIDS = new object[] { };
Commands2 commands = (Commands2)_applicationObject.Commands;
string debugMenuName = "Debug";

// Place the command on the Debug menu.
// Find the MenuBar command bar, which is the top-level command bar holding all the main menu items:
Microsoft.VisualStudio.CommandBars.CommandBar menuBarCommandBar =
    ((Microsoft.VisualStudio.CommandBars.CommandBars)_applicationObject.CommandBars)["MenuBar"];

// Find the Debug command bar on the MenuBar command bar:
CommandBarControl debugControl = menuBarCommandBar.Controls[debugMenuName];
CommandBarPopup debugPopup = (CommandBarPopup)debugControl;

_command = commands.AddNamedCommand2(
    _addInInstance,
    "CommandName",
    "Text to show in the menu",
    "Description of the command",
    true,
    Type.Missing,
    ref contextGUIDS,
    (int)vsCommandStatus.vsCommandStatusSupported + (int)vsCommandStatus.vsCommandStatusEnabled,
    (int)vsCommandStyle.vsCommandStylePictAndText,
    vsCommandControlType.vsCommandControlTypeButton);

_command.AddControl(debugPopup.CommandBar, 1);
```

This code grabs the Debug menu and adds the command with the specified parameters. This code should be wrapped in a
try-catch block to handle cases when the command already exists in the menu.
We can also add a keyboard shortcut to the command in the following way:

`    _command.Bindings = "Text Editor::ctrl+d, z"; `

Here "Text Editor" defines the scope of the shortcut. For more information see http://msdn.microsoft.com/en-us/library/envdte.command.bindings.aspx

# Browsing the code

Visual Studio automatically parses the current document and exposes a nice interface to browse the code. A code document
contains a tree of code elements. Each code element can be a namespace, a class, a method, etc., and it contains a
collection of child code elements. The root code elements can be accessed in this way:

`CodeElements elementsInDocument = this._applicationObject.ActiveDocument.ProjectItem.FileCodeModel.CodeElements;`

To show the browsing algorithm, here's a recursive method that shows how to get all the classes in the current document:

```
private static void RecursiveClassSearch(CodeElements elements, List<CodeClass> foundClasses)
{
    foreach (CodeElement codeElement in elements)
    {
        if (codeElement is CodeClass)
        {
            foundClasses.Add(codeElement as CodeClass);
        }
        RecursiveClassSearch(codeElement.Children, foundClasses);
    }
}
```

# Managing breakpoints

Managing breakpoints is very straightforward. The interface exposed through this._applicationObject.Debugger.Breakpoints
is pretty self-descriptive, and it contains functionality to add, remove, and browse through breakpoints.
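For instance, the core of the "class breakpoint" command can be sketched like this (a sketch, assuming the foundClasses list produced by RecursiveClassSearch above; Breakpoints.Add accepts the function name among many optional parameters):

```
// Sketch: set a breakpoint on every function of every class found in the document.
foreach (CodeClass codeClass in foundClasses)
{
    foreach (CodeElement member in codeClass.Members)
    {
        if (member is CodeFunction)
        {
            _applicationObject.Debugger.Breakpoints.Add(Function: member.FullName);
        }
    }
}
```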

# Installer

Once you have finished your add-in, the best way to distribute it is a Visual Studio Installer project. An add-in consists
of only two files: an *.AddIn XML file and a DLL. The easiest way to distribute them is to install both in the same
directory anywhere on the target machine (it might be under Program Files), and to add that directory to the add-in directories
of Visual Studio. The latter can be done easily with a registry key: in the registry editor window of your installation
project, add a string value at "HKLM\Software\Microsoft\VisualStudio\10.0\AutomationOptions\LookInFolders" with the name
[TARGETDIR] and a descriptive name in the value. The installer will resolve the [TARGETDIR] placeholder at runtime.
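For reference, the *.AddIn manifest is a small XML file along these lines (the assembly and class names below are illustrative):

```
<?xml version="1.0" encoding="UTF-16"?>
<Extensibility xmlns="http://schemas.microsoft.com/AutomationExtensibility">
  <HostApplication>
    <Name>Microsoft Visual Studio</Name>
    <Version>10.0</Version>
  </HostApplication>
  <Addin>
    <FriendlyName>Break All</FriendlyName>
    <Description>Sets a breakpoint on every member of every class in the current document.</Description>
    <!-- Assembly and FullClassName must match your compiled add-in -->
    <Assembly>BreakAll.dll</Assembly>
    <FullClassName>BreakAll.Connect</FullClassName>
    <LoadBehavior>1</LoadBehavior>
    <CommandPreload>1</CommandPreload>
    <CommandLineSafe>0</CommandLineSafe>
  </Addin>
</Extensibility>
```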

# CodePlex project

Here's the CodePlex site for this project: http://breakall.codeplex.com

# Conclusions

In this article I presented a useful add-in for Visual Studio and also showed how to create custom add-ins. For more information on creating add-ins you can visit MSDN: http://msdn.microsoft.com/en-us/vstudio/ff677564. I hope you find the add-in as useful as I do (maybe I'll publish a second version in the future), and I hope to see your great add-ins soon!