Tuesday, June 17, 2014

Creating a Custom Data Import with Microsoft CRM SDK

Part 2 - Beginning the Import Code


Part 1 of this series showed you how to set up the connection to CRM in preparation for importing records from a SQL Server table.  You might want to do this because you have SQL Server data in another system, or because you receive files from an outside data source that reside in a data warehouse.  Part 2 covers writing the code to import those records as Accounts in CRM 2011.

Create Your Staging Table

Creating a staging table to hold data temporarily while importing to CRM is helpful for several reasons.  First, if the schema or data in the external system you are exporting from changes, you only need to fix the part that exports that data to your SQL table.  The worker code that imports data to CRM stays the same, because it is looking at the staging table.  Second, depending on where the source data is housed, your import to CRM can take significantly less time.  Such scenarios include your data warehouse being on a different server than your CRM, or the source data being in the cloud.  If your staging table is on the same server as CRM, your import will be faster.

Notice I didn't say the same database, but the same server.  In my experience, best practice includes creating a separate database for your custom tables and stored procedures, so that if the deployed CRM database ever needs to be dropped and rebuilt, your custom code doesn't go away with it.

For this example we will create a new staging table in your support database.  Naming conventions are personal, so feel free to use whatever names work for you.  I have named my support database "MSCRM_Support" and created a staging table called "MSCRM_Support.dbo.tmpAccountImport".  At the very least, create the staging table, add the following fields to it and then save the table:


Figure 9
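
As a rough sketch, a minimal version of that staging table could be created with T-SQL along these lines.  AcctId, OwnerName, and CRMOwnerID come from the descriptions below; the AccountName column and the column sizes are placeholders for whatever your source data actually contains:

    CREATE TABLE MSCRM_Support.dbo.tmpAccountImport
    (
        AcctId      VARCHAR(50)      NULL,  -- unique identifier from the source system
        AccountName VARCHAR(200)     NULL,  -- placeholder for a source data field
        OwnerName   VARCHAR(200)     NULL,  -- owner's name as it appears in the source system
        CRMOwnerID  UNIQUEIDENTIFIER NULL   -- filled in later with the matching CRM SystemUserId
    );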


I am not going to get into primary keys, indexes or nulls at this point.  This is a separate topic for a possible later discussion.  

The AcctId should be the unique identifier from the system you are importing from.  In my case I am importing it as a varchar, but it can just as easily be any other type.  This field, as well as the others with the exception of CRMOwnerID, should be populated from your source data.  The OwnerName field is meant to bring over the owner from another system.  In my case, there is an owner on the Accounts in the source data.

So now we need to find the CRM owner that matches the OwnerName from the source record.  What you need here is the SystemUserId (a guid) from the SystemUserBase table, or the SystemUser view.  You will want to update the CRMOwnerID field in your tmpAccountImport table with this guid.  Join the tmpAccountImport table to the SystemUserBase table in your main CRM database.  You can join on a number of different fields, depending on what your source data looks like; appropriate fields could be FirstName, LastName, or FullName.  Since my source data has the user's full name, we will be using FullName.

Here is an example of what your update query might look like:

Figure 10
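
For reference, a minimal version of that update (assuming the CRM organization database is named YourOrg_MSCRM and you are matching on FullName) could look like this:

    UPDATE  t
    SET     t.CRMOwnerID = su.SystemUserId
    FROM    MSCRM_Support.dbo.tmpAccountImport t
            INNER JOIN YourOrg_MSCRM.dbo.SystemUserBase su
                ON su.FullName = t.OwnerName;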

So what if you are importing some records that already have an owner, and some that do not?  What I do is designate a CRM user as a default user, and let the sales people assign the Accounts amongst themselves in the CRM system.  In this case, you can simply update the tmpAccountImport table with the guid for your default user, whenever something doesn't match.
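
A follow-up statement in the same vein can stamp the default user onto anything that didn't match (the guid below is just a placeholder for your default user's SystemUserId):

    UPDATE  MSCRM_Support.dbo.tmpAccountImport
    SET     CRMOwnerID = '00000000-0000-0000-0000-000000000000'  -- your default user's SystemUserId
    WHERE   CRMOwnerID IS NULL;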


Now you have your staging table ready to import to CRM.  

Creating the Structure for the Account Import Code

Start out by adding your connection string to the App.config file.  Here's an example of what yours should look like.  It should be within the <configuration> tags but outside any of the other elements.

Figure 11
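
The entry is just a standard connectionStrings section; something along these lines, with the name, server, and security settings adjusted to your environment (the name "MSCRM_Support" here is my own assumption):

    <configuration>
      <connectionStrings>
        <add name="MSCRM_Support"
             connectionString="Data Source=YourSqlServer;Initial Catalog=MSCRM_Support;Integrated Security=True"
             providerName="System.Data.SqlClient" />
      </connectionStrings>
    </configuration>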


Now go back to Program.cs and add the variables from Figure 12, below.  Put them just under the variables we added in Part 1 of this series.  fConn will hold our connection string.  "act" is for the Account entity we are working with.  sLastAcctId will hold the last Account ID processed, so that if there is an error with the import we can tell which record broke it.  iCount keeps count of the total number of records imported.

Figure 12
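
As a sketch, the declarations might be as simple as the following.  The exact types are my own assumptions, and Account comes from the early-bound classes generated in Part 1:

    private static string fConn = "";        // connection string read from App.config
    private static Account act = null;       // the Account entity we are creating or updating
    private static string sLastAcctId = "";  // last AcctId touched, so we know which record broke a failed import
    private static int iCount = 0;           // running total of records imported
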
Create a new (void) function and call it AccountImport().  Put it in the Program class but outside of any other functions.  Next you will need to create a variable of type MyServiceContext.  Remember, this is the file you created in Part 1 that holds the entities bound to your CRM service.  Create a string variable and a new StringBuilder variable.  The StringBuilder variable will hold the SQL that retrieves data from your temporary table.  Also create variables to hold the Guid for the CRM Account ID and the string AcctId that you are importing.  By now your code should look something like this:


Figure 13
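
Put together, the shell of the function might look something like this.  The variable names (other than sAcct) are my own guesses, and MyServiceContext is the context class from Part 1; the snippet assumes using System.Text; and Microsoft.Xrm.Sdk at the top of Program.cs:

    private static void AccountImport()
    {
        MyServiceContext ctx = new MyServiceContext(_serviceProxy);  // entities bound to the CRM service (Part 1)
        string sName = "";                           // general-purpose string for field values
        StringBuilder sbSql = new StringBuilder();   // SQL used to read the staging table
        Guid gAccountId = Guid.Empty;                // the CRM Account ID
        string sAcct = "";                           // the AcctId being imported
    }
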
Append to your StringBuilder variable the SQL statement that selects the records from your tmpAccountImport table.  I would suggest checking for nulls in each of the fields.  CRM sometimes doesn't like being given null values; an empty string is much better.  Here's an example:


Figure 14
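
For instance, something along these lines; the non-key columns are placeholders for whatever you actually created in the staging table:

    sbSql.Append("SELECT AcctId, ");
    sbSql.Append("ISNULL(AccountName, '') AS AccountName, ");
    sbSql.Append("ISNULL(OwnerName, '') AS OwnerName, ");
    sbSql.Append("CRMOwnerID ");
    sbSql.Append("FROM MSCRM_Support.dbo.tmpAccountImport");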

Next, add a try/catch block and connect to the _serviceProxy.  Set up a new ColumnSet and add to it the names of the fields that you need to see or compare in CRM.  Columns are added as a string array, and the column names are the actual field names in CRM.  They are case sensitive, so make sure you are using the exact names.  Since we are importing/updating records, we only need to retrieve a few fields; retrieving more than you need will slow down your import.

You can now add code to connect to the database and read through the records.  Inside the "while" code block, assign the sAcct variable the value of the "AcctId" field from the current record.  Your code should now look something like Figure 15:

Figure 15
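
Here is a rough approximation of where the code stands at this point.  The columns in the ColumnSet are examples only, and I am assuming fConn already holds the connection string from App.config (using it is covered in Part 3):

    // Requires: using System.Data.SqlClient; and using Microsoft.Xrm.Sdk.Query;
    try
    {
        // _serviceProxy was set up in Part 1; EnableProxyTypes lets us use the early-bound Account class.
        _serviceProxy.EnableProxyTypes();

        // Only the columns we actually need to see or compare in CRM -- names are case sensitive.
        ColumnSet cols = new ColumnSet();
        cols.AddColumns("accountid", "name", "accountnumber", "ownerid");

        using (SqlConnection conn = new SqlConnection(fConn))
        {
            conn.Open();
            using (SqlCommand cmd = new SqlCommand(sbSql.ToString(), conn))
            using (SqlDataReader dr = cmd.ExecuteReader())
            {
                while (dr.Read())
                {
                    sAcct = dr["AcctId"].ToString();
                    sLastAcctId = sAcct;

                    // Part 3 adds the lookup against CRM (using cols), the create/update,
                    // and the iCount increment here.
                }
            }
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine("Import stopped on account {0}: {1}", sLastAcctId, ex.Message);
    }
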
This might seem like a lot, but really there's not much new in this code.  If you've coded in C# before, you've connected to a database a million times, probably much like the above code.  The new pieces here are connecting to the CRM service itself and retrieving the columns you need to use.


Part 3 of this series will finish out the import code and show how to use your connection string.  Finally, we will write out the results of the import.

Related Links:
Part 1 - Getting Started
Part 3 - Finishing the Import/Update Code 
Part 4 - Reconciling More than One Match - Merging Accounts

