Friday, 21 November 2008

Using a Custom Database for reporting queries

Very often you need to create custom reports, and sometimes the queries are non-trivial. At that point it pays to create a separate custom database: keeping your views and stored procedures out of the MSCRM databases keeps you supported by Microsoft and simplifies maintenance, particularly in multi-tenant scenarios. Note: This article assumes that all relevant databases are on the same server.

But what about security and other settings - what do you need on your custom database?

Whether or not you are basing your custom reports on the stock ones, you may also find it very useful to keep using the common shared datasource, because you can then easily return tenant-specific formats, culture information and so on.

This can all seem a bit of a nightmare to wade through, but it is possible, and this article shows you how, based on my own experience...

First of all, consider the collation of your custom database and make it the same as that of your tenant databases (CI_AI). If you don't, you will later hit collation compatibility errors when joining across databases - issues you don't want or need.
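
For example, a quick way to check and match - the tenant database name here is illustrative and the exact collation may differ on your server, so check before you create anything:

-- check the collation of an existing tenant database (the name is an assumption)
SELECT DATABASEPROPERTYEX('Contoso_MSCRM', 'Collation')
GO
-- create the custom reporting database to match
CREATE DATABASE [CUSTOMDB] COLLATE Latin1_General_CI_AI
GO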

Next, users, roles and security. You need to create database users for, and grant roles to, both NETWORK SERVICE and the CRM reporting group (SQLAccessGroup). You can use the following script to do this:

USE [CUSTOMDB]
GO
CREATE USER [NT AUTHORITY\NETWORK SERVICE] FOR LOGIN [NT AUTHORITY\NETWORK SERVICE] WITH DEFAULT_SCHEMA=[dbo]
EXEC sp_addrolemember 'db_owner',[NT AUTHORITY\NETWORK SERVICE]
GO
DECLARE @SQLAccessGroup VARCHAR(255)
SELECT TOP 1 @SQLAccessGroup = name FROM sys.server_principals WHERE CHARINDEX('SQLAccessGroup', name) <> 0
EXEC ( 'CREATE USER [' + @SQLAccessGroup + '] FOR LOGIN [' + @SQLAccessGroup + ']' )
EXEC ( 'sp_addrolemember ''db_owner'', [' + @SQLAccessGroup + ']' )
GO

So now you have your custom database set up and ready to populate with views, stored procedures, functions etc. If you are working in a multi-tenant environment (or even otherwise), and want to keep the standard shared datasource, you should aim to pass the organization name from your report in addition to any other parameters. Consider using a query similar to the following in each report.

<Query>
<DataSourceName>CRM</DataSourceName>
<CommandText>
DECLARE @orgname Varchar(100)
SELECT @orgname = Name FROM OrganizationBase
EXEC [CUSTOMDB].dbo.usp_customsp @orgname, @filter
</CommandText>
<QueryParameters>
<QueryParameter Name="@filter">
<Value>=Parameters!CRM_xxxxxxxxxxxxxxxxx.Value</Value>
</QueryParameter>
</QueryParameters>
</Query>

The final piece of the puzzle is creating a function in your custom database to return the specific tenant's database name. You can use this in your dynamic SQL to retrieve your query information from the right database.

CREATE FUNCTION [dbo].[GetDBName] (@orgname varchar(100))
RETURNS varchar(100)
AS
BEGIN
DECLARE @dbname varchar (100)
SELECT @dbname = DatabaseName FROM MSCRM_CONFIG.dbo.Organization WHERE (UniqueName = @orgname) OR (FriendlyName = @orgname)
RETURN @dbname
END
GO
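
To tie it together, the stored procedure called from the report (usp_customsp above) can use GetDBName to build its query against the right tenant database. A minimal sketch - the FilteredContact view and the column list are illustrative assumptions, and @filter is assumed to arrive as a ready-made WHERE clause fragment:

CREATE PROCEDURE [dbo].[usp_customsp] (@orgname varchar(100), @filter varchar(1000))
AS
BEGIN
DECLARE @dbname varchar(100), @sql nvarchar(max)
SET @dbname = dbo.GetDBName(@orgname)
-- query the tenant-specific database via its filtered views
SET @sql = N'SELECT fullname, emailaddress1 FROM [' + @dbname + '].dbo.FilteredContact WHERE ' + @filter
EXEC sp_executesql @sql
END
GO

Querying the filtered views also keeps the normal CRM record-level security in play for the user running the report.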

This information would have saved me a lot of time when I started writing custom reports, so I hope it helps someone out there.

Thursday, 31 July 2008

And now... Multi-lingual customizations

Recently faced with the task of ensuring my add-in (VEM) was usable in at least the most common international languages, I explored the GoogleAPI theme a little further and wrote a very small, simple utility for adding translations to the MSCRM4 customizations file.

http://www.vizola.com/vem/TranslateXML

This means you can do a customizations export from a system having only (say) 1033 - US English, then add the translations using this utility, and re-import into another system using a different language or multiple languages. It's somewhat quicker than using the built-in translations export/import process!

Do keep in mind that we're using 'machine' translation here so the result will ideally be sanity-checked by a real person :-)

This utility translates from/to English, German, French, Italian, Dutch and Spanish. As long as you have at least one of these to start, it will fill in the gaps as needed.

One thing I found, however, is that MSCRM will NOT 'back-fill' any new translations (which is a real pain!) - this means that if, say, custom entities/attributes already exist in a single language, you need to delete them (together with any data) before you can import the multi-lingual versions. Making a utility to work with CrmTranslations.xml might not be so easy!

BTW, the GoogleAPI currently offers about 40 languages so adding/changing the current set is pretty easy. And if anyone wants the simple source just email me.

Monday, 23 June 2008

Multi-lingual SSRS reports

So you want to make your reports multi-lingual, sensitive to each user's browser language preference. Wow! That sounds like a big job. That's what I thought first of all, but there was a requirement, so a bit of investigation later, it seems like things have got a bit easier...

First of all, SSRS reports expose a built-in global [User!Language] which reflects the user's primary language (the one at the top of the list in IE). This is very helpful because it's not possible to find this out client-side in JavaScript (after much trying!); it's only available server-side.

Secondly, thanks to Google's TranslateAPI, you can automate a 'rough-cut' translation for the labels and tooltips which you will want to be language-sensitive.

Thirdly, in reports you can substitute each 'hard-coded' label with an embedded function call that returns something language-sensitive. It's a little annoying that you must pass any references into this function, as embedded code operates in a 'sandbox'. A decision I made was to store all the translations in the report RDL itself, rather than in the database - this might not be best for everyone. The translations can be stored as hidden parameters and retrieved by the embedded function.

eg. =Code.GetLabel(User!Language, Parameters!Labels, "Contact owner")

You've got all the tips now, but it still looks like a fair amount of work across a number of reports. So the job really comes down to mechanising the whole operation, and for that I can show some code I used. Credit must go to all the sources I used to find most of the techniques, of which there were several.

Anyway I knocked up a utility to load up the RDL DOM, make a nodelist of the labels and tooltips and substitute the function call, whilst taking the previously existing (en) text, 'Googlising' it into 5 other languages and storing it as hidden parms.

Voila. Enjoy!

XmlDocument xml = new XmlDocument();
xml.Load(sFile);
//temporarily remove default namespace to avoid excess references
//also simplifies the code considerably!
string defns = xml.DocumentElement.NamespaceURI;
xml.LoadXml(xml.OuterXml.Replace("xmlns=\"" + defns + "\" ", ""));
XmlNode code = xml.SelectSingleNode("//Code");
if (code == null)
{//then make this node
XmlNode ds = xml.SelectSingleNode("//DataSets");
code = xml.CreateElement("Code");
xml.DocumentElement.InsertBefore(code, ds);
}
if (! code.InnerText.Contains("Public Function GetLabel"))
{//then add required function
code.InnerText += "Public Function GetLabel(Lang as String, P as Parameter, Label as String) as String" + Environment.NewLine;
code.InnerText += "Dim i As Integer" + Environment.NewLine;
code.InnerText += "For i = 0 to Ubound(P.Value)" + Environment.NewLine;
code.InnerText += "If (P.Value(i) = Left(Lang,2) + \"_\" + Label) Then Return P.Label(i)" + Environment.NewLine;
code.InnerText += "Next i" + Environment.NewLine;
code.InnerText += "Return Label" + Environment.NewLine;
code.InnerText += "End Function";
}

XmlNode parms = xml.SelectSingleNode("//ReportParameters");
XmlNode parm = parms.SelectSingleNode("//ReportParameter[@Name='Labels']");
if (parm == null)
{//then make this node
parm = xml.CreateElement("ReportParameter");
parms.AppendChild(parm);
XmlAttribute nm = xml.CreateAttribute("Name");
nm.Value = "Labels";
parm.Attributes.Append(nm);
parm.InnerXml = "String";
parm.InnerXml += "";
parm.InnerXml += "true";
parm.InnerXml += "";
parm.InnerXml += "true";
}
XmlNode defvals = parm.SelectSingleNode("DefaultValue/Values");
XmlNode valvals = parm.SelectSingleNode("ValidValues/ParameterValues");
string[] lang = { "en", "it", "fr", "de", "nl", "es" };
Language[] lenum = { Language.English, Language.Italian, Language.French, Language.German, Language.Dutch, Language.Spanish };
XmlNodeList lbls = xml.SelectNodes("//Textbox/Value|//Textbox/ToolTip");
foreach (XmlNode lbl in lbls)
{
if (! lbl.InnerText.StartsWith("="))
{
string def = lbl.InnerText;
lbl.InnerText = "=Code.GetLabel(User!Language, Parameters!Labels, \"" + def + "\")";

for (int i = 0; i < 6; i++)
{
XmlNode defval = xml.CreateElement("Value");
defval.InnerText = lang[i] + "_" + def;
defvals.AppendChild(defval);
XmlNode valval = xml.CreateElement("ParameterValue");
valval.InnerXml = "" + lang[i] + "_" + def + "";
valval.InnerXml += "";
valvals.AppendChild(valval);
}
}
}
//add back the default namespace
XmlAttribute rdef = xml.CreateAttribute("xmlns");
rdef.Value = defns;
xml.DocumentElement.Attributes.Prepend(rdef);
xml.Save(sFile);

Programmatically adding reports in MSCRM v4

New in v4, lots more objects are now system entities within MSCRM, including REPORTS. This means that you have lots more control and flexibility in manipulating reports in code, so I was surprised that I couldn't find good examples for uploading them. In case anyone else is looking for an example, I'm posting this...

Back in v3 we used publish.config as a convenient way of itemising the reports of interest and how we wanted them installed, so I have used the same file, adding only one more attribute [@parentname] as v4 is more picky in this regard.
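
For reference, the loop below expects report entries along these lines - the layout and values here are purely illustrative, only the attribute names matter to the code:

<reports>
<report filename="Account Summary.rdl" name="Account Summary" description="Custom account summary" parentname="" category="1" relatedentity="account" showin="reports;entity;form" />
<report filename="Account Summary Detail.rdl" name="Account Summary Detail" description="Drill-down for the account summary" parentname="Account Summary" category="1" relatedentity="account" showin="form" />
</reports>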

You will notice I'm using the XPathNavigator rather than the XmlDocument in order to pre-sort the reports and ensure that the parent reports get created first. There are probably better/more elegant ways of doing some parts, for which comments are gladly received.

Enjoy!

Addendum: You need to set the languagecode in order to get the reports to show in all views, plus there are some other useful params for maintaining/updating the reports which could be added as attributes to the config file.

rpt.languagecode = new CrmNumber(1033);
rpt.signatureid = new UniqueIdentifier(Guid.NewGuid());
rpt.signaturedate = new CrmDateTime("2008/06/23");
rpt.signaturemajorversion = new CrmNumber(4);
rpt.signatureminorversion = new CrmNumber(0);
rpt.signaturelcid = new CrmNumber(1033);



string cfgfile = "publish.config";
string rptfile = "";

#region Open config and sort reports

//open the config file and sort the report nodes
XPathDocument doc = new XPathDocument(cfgfile);
XPathNavigator nav = doc.CreateNavigator();
XPathExpression xrpts = nav.Compile("//report");
xrpts.AddSort("@parentname", XmlSortOrder.Ascending, XmlCaseOrder.None, "", XmlDataType.Text);

#endregion

foreach (XPathNavigator xrpt in nav.Select(xrpts))
{//load each report inc parent link and related as necessary
rptfile = xrpt.GetAttribute("filename", "");
report rpt = new report();

#region Set report parameters

using (StreamReader reader = new StreamReader(rptfile))
{
rpt.bodytext = reader.ReadToEnd();
}
rpt.iscustomreport = new CrmBoolean(false);
rpt.ispersonal = new CrmBoolean(false);
rpt.isscheduledreport = new CrmBoolean(false);
rpt.name = xrpt.GetAttribute("name", "").ToString();
rpt.description = xrpt.GetAttribute("description", "").ToString();
rpt.reporttypecode = new Picklist(ReportTypeCode.ReportingServices);
rpt.filename = rptfile;

#endregion

#region Set the parent link if necessary

string pn = xrpt.GetAttribute("parentname", "");
if (pn != "")
{//get the Id of the first matching parent report
FilterExpression flt = new FilterExpression();
flt.Conditions.Add(new ConditionExpression("name", ConditionOperator.Like, pn));
QueryExpression qry = new QueryExpression(EntityName.report.ToString());
qry.Criteria = flt;
report prpt = (report)service.RetrieveMultiple(qry).BusinessEntities[0];
rpt.parentreportid = new Lookup(EntityName.report.ToString(), prpt.reportid.Value);
}

#endregion

Guid rptId = service.Create(rpt);

#region Execute SetReportRelatedRequest
SetReportRelatedRequest rrReq = new SetReportRelatedRequest();
rrReq.ReportId = rptId;

#region Set the categories for the report

string[] cats = xrpt.GetAttribute("category", "").Split(';');
ArrayList catsi = new ArrayList(); int tmp = 0;
for (int i = 0; i < cats.Length; i++)
{
int.TryParse(cats[i], out tmp);
catsi.Add(tmp);
}
rrReq.Categories = (int[])catsi.ToArray(typeof(int));

#endregion

#region Set the relationships for the report

string[] rents = xrpt.GetAttribute("relatedentity", "").Split(';');
ArrayList rentsi = new ArrayList();
if (rents.Length != 0 && rents[0] != "")
{
for (int i = 0; i < rents.Length; i++)
{//get the OTC for each related entity
RetrieveEntityRequest req = new RetrieveEntityRequest();
req.RetrieveAsIfPublished = true;
req.LogicalName = rents[i];
req.EntityItems = EntityItems.EntityOnly;
RetrieveEntityResponse res = (RetrieveEntityResponse)mservice.Execute(req);
EntityMetadata entMdata = res.EntityMetadata;
rentsi.Add(entMdata.ObjectTypeCode.Value);
}
}
rrReq.Entities = (int[])rentsi.ToArray(typeof(int));

#endregion

#region Set the visibility for the report

string[] viss = xrpt.GetAttribute("showin", "").ToString().Split(';');
ArrayList visi = new ArrayList();
for (int i = 0; i < viss.Length; i++)
{
switch (viss[i])
{
case "reports": visi.Add(ReportVisibilityCode.ReportsGrid); break;
case "entity": visi.Add(ReportVisibilityCode.Grid); break;
case "form": visi.Add(ReportVisibilityCode.Form); break;
}
}
rrReq.Visibility = (int[])visi.ToArray(typeof(int));

#endregion

// Execute the request.
SetReportRelatedResponse reportRelatedResponse = (SetReportRelatedResponse)service.Execute(rrReq);

#endregion

}

Tuesday, 5 February 2008

Lookup Dialog Investigations

Recently there seems to have been renewed interest in trying to re-use the standard lookup dialogue inside the CRM system in different ways. One of the more frequent requests is to be able to filter the returned records in a custom way, so I decided to look in a bit more detail at what the issues were. For example, say you set an account to be the customer for an opportunity and you wanted to select one or more significant contacts within the account as significant relationships (perhaps influencer and decision maker); then it would make sense to show just the contacts for the customer account. Well, you can't!

OK, it didn't take much searching before I found a request to Microsoft posted by Michael Hoehne way back in April 06. Now the definition for displayed entity types, columns and filtering is contained in a number (69) of lookupclass files in the Server\ApplicationFiles folder. Unfortunately, whereas you might think you could just create some more and use those, this will invariably lead to a System.IO 'File not Found' error. I even figured that perhaps these needed to be registered in the database. Not so; Michael found there is a hardcoded restriction which apparently is still there in CRM 4.0. Of course that opened the door for the Stunnware custom dialog, which relies on just this shortcoming.

Then (back on our example problem of selecting account contacts for an opportunity) I chanced across an interesting blog by Darren Lui concerning how to invoke the lookup dialog and keep the contact in sync, if we're only interested in a single contact. Unfortunately this doesn't tackle the problem of filtering by account.

During the course of my investigations I did come across a couple of things which mean that the determined developer can achieve something:
  1. In fact there is a mechanism for passing filtering parameters in to the lookup dialog (see the onload sketch after this list) - using
    eg. additionalparams = "&customerid=" + crmForm.all.customerid.DataValue[0].id ;
  2. Although you can only define one entity columnset/filter per lookupclass file, you can 're-use' non-conflicting files for other entity types, eg. territory.xml contains a definition for entityid=2013, but not entityid=2 (contact).
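
Putting the first of these to work on our opportunity example, something like the following in the opportunity form onload passes the selected customer through to a custom contact lookup - the attribute name new_keycontactid is hypothetical, and this is of course unsupported territory:

//pass the current customer into the lookup for a custom 'key contact' field (new_keycontactid is a hypothetical attribute)
if (crmForm.all.customerid.DataValue != null)
{
crmForm.all.new_keycontactid.additionalparams = "&customerid=" + crmForm.all.customerid.DataValue[0].id;
}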

In the next installment, I shall hope to show how you can achieve useful custom functionality without 'serious' unsupported customisations.

Friday, 25 January 2008

Meeting room booking in MS CRM?

I have been moved to comment on this because I chanced across a couple of whitepaper downloads from Microsoft recently on the subject of Service Scheduling. It is suggested that even large scale companies can consider using MS CRM for their room booking requirements. You can find the performance testing results here. Please look at the type of high-performance hardware they deduce is necessary to meet their 'acceptable' response rates.

It took me back to a project we undertook for a global corporate at the beginning of the century(!). This large company was attempting to use Outlook / Exchange functionality for booking their 80+ meeting rooms. Finding problems, they first banned users from booking their own rooms because the network and servers began to get over-stressed with the load - even though, being a telecoms company, their network bandwidth was enormous! Pretty soon the issue escalated to mission-critical when Exchange servers started dying all around.

So what we did was to decouple the two areas - we developed a web-based employee self-service application, hosted within their portal, which is not that complex and uses availability snapshots to present the user with booking choices. The user submits a request and, if the resource is still available, gets immediate confirmation. The design of the database and stored procedures means double-bookings are impossible.

But now of course there is the ever-present pressure to amalgamate applications, driven by Total Cost of Ownership (TCO) arguments and the emergence of SOA. So what we have done is simply to make our room booking system accessible 'through' the MS CRM sitemap. Take a look at the first draft of a solution we can provide. And remember: you will not risk your CRM system becoming sluggish and unresponsive, you will have an optimised solution for your business issue which is accessible from anywhere, and you won't have to buy CRM user licenses just for users to access room booking.

If there is demand I can see a case for configuring the system using MS CRM entities, but I strongly believe that the 'booking engine' should remain optimised for its specific purpose. I'm looking to test the demand for a powerful, quite inexpensive solution accessible through the add-in route, and would appreciate comments.

Monday, 21 January 2008

Email Source Edit Function

If you've ever used email quick campaigns or email activities in normal campaigns, you've likely been frustrated by the spontaneous re-formatting that sometimes happens when you paste in content from your favourite HTML editor program. Oh, why doesn't the editor have a 'View Source' button to see what's actually there? Well, this post shares how you can add such a feature in an unsupported but minimally invasive way.

First of all, the concept. We know the page is generated in code and we're not about to hack into this territory! Fortunately though, it's quite easy to hook into the onload event and run some javascript to add a button which will do what we want. After this we need to use the (toggle) button to replace the standard WYSIWYG editor iFrame with a textarea for the source view, plus a few lines of code to shift the content between the two elements. Really the only thing we have to ensure is that, when the source is written back to the iFrame, the body tag is contentEditable so the content can be edited again in WYSIWYG!

Note: I have found that an invaluable tool in any exploit of this kind is the IE DOM Explorer (downloadable from Microsoft), which allows you to easily locate elements and test dynamic manipulation of the document object. Although if you're trying to get to pages displayed in dialogs you have to be a little inventive ;-)

Step 1: To begin with we'll add a script to the page of interest (say emailForm.aspx) and attach a handler to document.body.onload to add the button. You can see that basically we're going to replace the first toolbar button with a longer piece of HTML which also includes the new button and a spacer. An alternative and more elegant way would be to clone (deep) the element, set the desired properties and then insert the new element into the DOM.

//this is the spacer
sp = "<td unselectable="'on'">";
sp += " <div unselectable="'on'"></div></td>";

function addViewSourceBtn(){
var t = document.getElementById("HTMLBAR");//get the toolbar
var c = t.getElementsByTagName("TD")[0].outerHTML; //get the first btn
var v = c.replace("cmd-cut.gif","cmd-insertKB.gif"); //clone and change
v = v.replace("Cut","Source");
v = v.replace("HTMLBAR.htmlExec('cut')","ViewSource()");
t.outerHTML = t.outerHTML.replace(c,v+sp+c); //recreate the toolbar
}
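
The hook-up itself can be as simple as chaining the page's existing onload so the HTMLBAR toolbar already exists when we touch it (a sketch - adjust if the page wires up its onload differently):

var crmOnload = document.body.onload; //preserve whatever the page already does on load
document.body.onload = function() {
if (crmOnload) crmOnload();
addViewSourceBtn();
};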

Step 2: The major part of the task is in the function which is called by the button.

  1. First of all we're going to initialise a variable in the script to hold the current [mode] of the toggle (see the sketch after this list). Alternatively, we could have stored this as an expando on the button and perhaps changed its appearance. We also have a variable [RichEdit] which holds the initial iFrame outerHTML, but we could easily hardcode it as per [TextEdit].
  2. When we switch to source edit mode, we're going to disable any wizard buttons (say in the quick campaign wizard) and hide all the other toolbar buttons.
  3. Then, when we toggle to source edit, we set the value of the textarea; going the other way, we need to write the document back into the iFrame.
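
A sketch of the initialisation mentioned in point 1 (placed in the same script, before ViewSource is first used):

var mode = "textedit"; //the view the next click of the button switches to
var RichEdit = document.all['descriptionIFrame'].outerHTML; //keep the original WYSIWYG iFrame markup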

That's it! Enjoy.

UPDATE: Of course the better way to do this is to use this code in the customisable email onload handler in MSCRM. This saves you having to make any edits to the pages themselves, which is strictly unsupported.

function ViewSource(){
// define different editors
var TextEdit = '<textarea ID="descriptionIFrame" style="width:100%;height:100%;"></textarea>';
var editor_obj = document.all['descriptionIFrame'];

if (mode == "textedit") {
mode = 'wysiwyg';
with(window.parent){
document.all['btn_id_Next'].disabled=true;
document.all['btn_id_Back'].disabled=true;
}
var editdoc = editor_obj.contentWindow.document;
var contents = editdoc.documentElement.outerHTML;
editor_obj.outerHTML = TextEdit;
var editor_obj = document.all['descriptionIFrame'];
editor_obj.value = contents;

b = document.all['HTMLBAR'].getElementsByTagName("TD");//hide wysiwyg buttons
for (i=0;i<b.length;i++){with(b[i]){ if (title!='Source')style.visibility='hidden'; } }

} else {
mode = 'textedit';
var contents = editor_obj.value;
editor_obj.outerHTML = RichEdit;
var editor_obj = document.all['descriptionIFrame'];
var editdoc = editor_obj.contentWindow.document;

editdoc.open();
editdoc.write(contents);
editdoc.close();

b = document.all['HTMLBAR'].getElementsByTagName("TD");//show wysiwyg buttons
for (i=0;i<b.length;i++){b[i].style.visibility='';}

with(window.parent){
document.all['btn_id_Next'].disabled=false;
document.all['btn_id_Back'].disabled=false;
}
}
}