Chapter 13. Interfacing with Tabular

You can create an Analysis Services solution by using existing development and client tools, such as SQL Server Data Tools (SSDT), Power BI, or Excel. Using the libraries supported in script and programming languages, you can customize these solutions. In this chapter, you will learn about these libraries for defining models and performing administrative tasks. To better understand this chapter, you will need a basic knowledge of PowerShell and/or managed languages, such as C#.

The goal of this chapter is to introduce all the libraries available to programmatically access tabular models in Analysis Services, so that you can evaluate the correct approach based on your requirements. You will find several links to documentation and examples throughout the chapter to help you understand all the details and possible parameters of the available functions.

This chapter does not cover implementing query support from client code (which is possible using the ADOMD.NET managed library) or the native Analysis Services OLE DB Provider (also known as the MSOLAP provider). Documentation for these topics can be found at https://msdn.microsoft.com/en-US/library/bb500153.aspx.


Image What’s new in SSAS 2016

SSAS 2016 introduces a new library, called the Tabular Object Model (TOM). This chapter, which is brand new, covers the architecture of this library and includes several examples of using it with C# and PowerShell.


Introducing the AMO and TOM libraries

Analysis Services has a generic API called Analysis Management Objects (AMO) to administer both tabular and multidimensional deployments. Certain classes are shared between the two modes, whereas others are specific to each model type. The specific API for tabular models is included in an extension of AMO called Tabular Object Model (TOM). It is correct to reference both APIs as AMO, even if TOM is used only to create and manage Tabular metadata objects. To make a distinction between APIs for multidimensional and tabular models, however, it is common to use AMO for Multidimensional, and TOM for Tabular. Nevertheless, TOM still uses the AMO libraries that are shared between the two modes.

Starting with Analysis Services 2016, the original AMO library was divided into the following assemblies:

Image Core (Microsoft.AnalysisServices.Core.dll) This contains classes common to both Tabular and Multidimensional.

Image TOM (Microsoft.AnalysisServices.Tabular.dll) This includes classes to create and manage a tabular model.

Image AMO (Microsoft.AnalysisServices.dll) This includes classes to create and manage a multidimensional model.

Image Json (Microsoft.AnalysisServices.Tabular.Json.dll) This is an assembly used internally by TOM that controls the serialization of metadata objects.

The TOM library can only manage tabular models with a compatibility level of 1200 or higher. If you want to manage models with the compatibility levels of 1050 through 1103, you need the AMO library. You can find more details on assembly dependencies and deployments at https://msdn.microsoft.com/en-us/library/mt707783.aspx.

As you will see later, there are classes with the same names defined in different namespaces to avoid name conflict between AMO and TOM libraries. Depending on the compatibility level you want to support in your code, you must create a class instance from either the Microsoft.AnalysisServices namespace (AMO, for compatibility levels 1050 through 1103) or the Microsoft.AnalysisServices.Tabular namespace (TOM, for compatibility levels 1200 or higher).

Introducing AMOs

This section identifies the specific set of classes and assemblies that are shared between multidimensional and tabular models. Analysis Services Management Objects (AMOs) include classes defined in two assemblies: AMO (Microsoft.AnalysisServices.dll) and Core (Microsoft.AnalysisServices.Core.dll). This division is required for compatibility with existing code that is designed for former versions of AMO. It supports access to SSAS databases with the compatibility levels of 1050 through 1103. Legacy code directly references only the AMO assembly and the Microsoft.AnalysisServices namespace.

For example, the AMO assembly code in Listing 13-1 connects to the Tabular instance of Analysis Services on the local machine, iterates the databases, and displays the list of dimensions for each one. Such code works for both Tabular and Multidimensional instances of Analysis Services. For a Tabular instance, each table corresponds to a dimension in the metadata that is provided to AMO.

Listing 13-1 Script PowerShell\List Tabular Tables 1103.ps1


[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")
$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect("localhost\tabular")
foreach ( $db in $server.Databases ) {
    $db.Name
    foreach ( $table in $db.Dimensions ) {
        "-->" + $table.Name
    }
}


When connecting to a Tabular instance, the output includes the list of tables in databases with compatibility levels 1050 through 1103. There are no entries for databases with the compatibility level 1200 or higher. For example, in the following output there are three databases: AdventureWorks (with compatibility level 1103), Contoso, and Budget (both with compatibility level 1200). Only the AdventureWorks tables are included in the following output:

AdventureWorks
-->Currency
-->Customer
-->Date
-->Employee
-->Geography
-->Product
-->Product Category
-->Product Subcategory
-->Promotion
-->Reseller
-->Sales Territory
-->Internet Sales
-->Product Inventory
-->Reseller Sales
-->Sales Quota
Contoso
Budget

To access tables with compatibility level 1200 or higher, you must use the TOM assembly to get the Server and Database class instances from a different namespace. AMO exposes metadata for multidimensional models, because it was originally designed for that type of database. Previous versions of Analysis Services leveraged this existing infrastructure to expose database entities. When you access the compatibility levels 1050 through 1103, you must map tabular entities to the multidimensional concepts. For example, every table in a tabular model corresponds to a dimension in a multidimensional one. This book does not cover these legacy database models. For more information, you can reference the documentation at https://msdn.microsoft.com/en-us/library/hh230795.aspx.


Image Note

If you want to manage tabular models with compatibility levels 1050 through 1103, we suggest using the Tabular AMO 2012 library, which is available on CodePlex at https://tabularamo2012.codeplex.com/. This library is an AMO wrapper for multidimensional models, which exposes an object model close to the one provided by TOM.


Introducing the TOM

Tabular models with the compatibility level of 1200 or higher can be managed through the TOM library, which includes classes in three assemblies: Core, TOM, and Json. You only need to reference the AMO assembly if you want to access metadata from older compatibility levels, such as the Dimensions collection, which is shown in the previous example.

The TOM assembly defines all the classes in the Microsoft.AnalysisServices.Tabular namespace, whereas the Core assembly contains classes in the following two namespaces:

Image Microsoft.AnalysisServices.Core This includes an abstract definition of the Database, Role, Server, and Trace classes. You do not need to reference this namespace in your code. These classes are implemented in the Microsoft.AnalysisServices namespace of the AMO assembly and in the Microsoft.AnalysisServices.Tabular namespace of the TOM assembly.

Image Microsoft.AnalysisServices This includes objects for managing traces, security, XMLA commands, connections, impersonation, and more. These entities are shared between AMO and TOM libraries.

If your solution only references the TOM and Core assemblies, you can import both namespaces, but only a compatibility level of 1200 and higher will be supported. You might reference both AMO and Core assemblies, but in this case, you must consider possible ambiguity in class references. For example, suppose you create a script or tool that supports both the 1103 and 1200 compatibility levels. If so, you must be careful in choosing the namespace from which you reference the Database, Role, Server, and Trace classes.
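A common way to handle this ambiguity in C# is to introduce namespace aliases. The following sketch assumes both the AMO and TOM assemblies are referenced; the alias names are arbitrary and not part of the libraries:

```csharp
// Sketch: resolving class-name ambiguity when both AMO and TOM are referenced.
// The alias names (AMO, TOM) are our own choice.
using AMO = Microsoft.AnalysisServices;
using TOM = Microsoft.AnalysisServices.Tabular;

class CompatibilityProbe {
    static void Connect(string connectionString, bool isLegacy) {
        if (isLegacy) {
            // Compatibility levels 1050 through 1103
            var server = new AMO.Server();
            server.Connect(connectionString);
        } else {
            // Compatibility levels 1200 and higher
            var server = new TOM.Server();
            server.Connect(connectionString);
        }
    }
}
```

With the aliases in place, `AMO.Database` and `TOM.Database` can coexist in the same source file without fully qualified names.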

The code in Listing 13-2 references only the TOM assembly. It connects to the Tabular instance of Analysis Services on the local machine, iterates all the databases, and for each one, displays the list of tables included in the model.

Listing 13-2 Script PowerShell\List Tabular Tables 1200.ps1


[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices.Tabular")
$server = New-Object Microsoft.AnalysisServices.Tabular.Server
$server.Connect("localhost\tabular")
foreach ( $db in $server.Databases ) {
    $db.Name
    foreach ( $table in $db.Model.Tables ) {
        "-->" + $table.Name
    }
}


This code only works for Tabular instances of Analysis Services, and its output only includes the list of tables of a model in compatibility levels 1200 or higher. Such a list is empty for databases in compatibility levels 1050 through 1103. For example, the following output shows three databases: AdventureWorks (with a compatibility level of 1103), Contoso, and Budget (both with a compatibility level of 1200). Only the tables from Contoso and Budget are included in the output.

AdventureWorks
Contoso
-->Date
-->Sales
-->Currency
-->Product
-->Promotion
-->Store
Budget
-->Product
-->Date
-->Sales
-->Territory
-->Budget

As you see, supporting both compatibility models requires additional coding and the management of different implementations of the same abstract classes. For the remainder of the chapter, we will consider how to interface with compatibility level 1200 or higher. In this scenario, you reference only the Core and TOM libraries, creating instances of Server, Database, Role, and Trace classes from the Microsoft.AnalysisServices.Tabular namespace.

The object hierarchy of the classes available in the TOM library is shown in Figure 13-1. The Server class is the root of the hierarchy. It has a Databases property with a list of Database instances. The Database class has a Model property, which contains an instance of the Model class. This is the entry point for the metadata information that is specific to a tabular model with a compatibility level of 1200 or higher. Most of the other properties of the Server and Database classes are common to other tabular and multidimensional models.

Image

Figure 13-1 The object hierarchy in the TOM library.

If you want to look for valid tabular databases in an SSAS instance, you should first check the ServerMode property of the Server class. If it is Tabular, then you should analyze the StorageEngineUsed property of the Database class. For a tabular model, its value can be InMemory for compatibility levels 1050 through 1103, or it can be TabularMetadata for compatibility levels 1200 or higher. However, if you connect to an SSAS instance in Tabular mode, you can simply check whether the Model property is null before accessing it. While PowerShell automatically applies these checks, you need to be more explicit when writing similar code in C#. Listing 13-3 shows how you might check that the Model property is not null rather than evaluating the StorageEngineUsed property.

Listing 13-3 Models\Chapter 13\List Tabular Tables 1200\List Tabular Tables 1200.cs


using System;
using Microsoft.AnalysisServices;
using Microsoft.AnalysisServices.Tabular;

namespace ListTables {
    class Program {
        static void Main(string[] args) {
            Server server = new Server();
            server.Connect(@"localhost\tabular");
            if (server.ServerMode == ServerMode.Tabular) {
                foreach (Database db in server.Databases) {
                    Console.WriteLine("{0}:{1}", db.ToString(), db.StorageEngineUsed);
                    if (db.StorageEngineUsed == StorageEngineUsed.TabularMetadata) {
                        foreach (Table d in db.Model.Tables) {
                            Console.WriteLine("--> {0}", d.Name);
                        }
                    }
                }
            }
            server.Disconnect();
        }
    }
}


The Model class contains the same entities that are described in Chapter 7, “Tabular Model Scripting Language (TMSL).” In fact, TMSL is just the materialization of the object graph that is included in the Model class. In the section “Automating project deployment” later in this chapter, you will find functions to both read a model.bim file in memory by populating a Model class instance and create a model.bim file by just persisting the state of a Model instance. In fact, you can create and manipulate a tabular model without actually connecting to an SSAS instance. When you read the database metadata from a server, you have an object graph describing the database model. Any changes applied to this object graph are local to your code until you apply the changes to the server by invoking the SaveChanges method of the Model instance. For example, the script in Listing 13-4 adds a Margin measure to the Sales table in the Contoso database.

Listing 13-4 Script PowerShell\Add Margin Measure.ps1


[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices.Tabular")
$server = New-Object Microsoft.AnalysisServices.Tabular.Server
$server.Connect("localhost\tab16")
$db = $server.Databases["Contoso"]
$model = $db.Model
$tableSales = $model.Tables["Sales"]
$measureMargin = New-Object Microsoft.AnalysisServices.Tabular.Measure
$measureMargin.Name = "Margin"
$measureMargin.Expression = "[Sales Amount] - [Cost]"
$tableSales.Measures.Add( $measureMargin )
$model.SaveChanges()


When you invoke SaveChanges, the TOM library manages the communication to the SSAS instance by using XMLA and JSON protocols. The following sections provide a short description of these protocols to help you better understand the content of certain trace events if you use the SQL Server Profiler (as explained in Chapter 14, “Monitoring and tuning a Tabular service”), but you can safely ignore this underlying communication by using TOM.

TOM and XMLA

The TOM library uses the industry-standard XML for Analysis (XMLA) protocol to communicate with an SSAS instance, as described here: https://msdn.microsoft.com/en-us/library/ms977626.aspx. To manage tabular objects, Microsoft extended the XMLA protocol to add the SQL Server Analysis Services Tabular protocol (MS-SSAS-T) for use with compatibility levels 1200 and higher. For more information, see https://msdn.microsoft.com/library/mt719260.aspx.
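To make the relationship between the two protocols concrete, the following simplified sketch shows how a tool can submit a TMSL command as the Statement of a standard XMLA Execute request. The SOAP envelope is abbreviated to its essential elements; this is illustrative, not a complete capture of the wire format:

```xml
<Envelope xmlns="http://schemas.xmlsoap.org/soap/envelope/">
  <Body>
    <Execute xmlns="urn:schemas-microsoft-com:xml-analysis">
      <Command>
        <Statement>
          { "refresh": { "type": "full",
                         "objects": [ { "database": "Contoso" } ] } }
        </Statement>
      </Command>
      <Properties />
    </Execute>
  </Body>
</Envelope>
```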

TOM and JSON

The metadata of tabular models is structured as JSON documents, which can be transferred using a JSON syntax that adds command and object model definitions. This syntax is described in Chapter 7.

When you use the TOM library, the communication with SSAS uses the MS-SSAS-T protocol over the wire, not TMSL. Nevertheless, both TOM and TMSL share the same object structure (tables, columns, partitions, and so on) and the same operations (create, delete, and refresh). When you use TMSL, you send commands in JSON format over the wire to the SSAS server. If you use the Profiler to analyze the SSAS instance events, you will see different content in the resulting capture than when you execute the same logical operations by using the TOM library and TMSL scripts.

Should you use TOM or TMSL to interface with a Tabular instance? The answer depends on your requirements. A TMSL script can be easily executed through SQL Server Management Studio (SSMS) or a SQL Server Agent job. You can also execute a TMSL script in PowerShell by using the Invoke-ASCmd cmdlet or the Execute method of the Microsoft.AnalysisServices.Tabular.Server class in the TOM library. However, TMSL defines operations only at the database, table, partition, or role level. When you need to operate at a finer grain, you should consider using TOM. For example, if you want to add, change, or delete a single measure or column in a table, you can easily apply the individual changes by using TOM, whereas you must include the entire table definition when using TMSL.
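As a minimal sketch of the Invoke-ASCmd option mentioned above, the following assumes the SqlServer PowerShell module (which provides the cmdlet) is installed; the server name and file path are illustrative:

```powershell
# Execute a TMSL script file against a Tabular instance.
# Assumes the SqlServer module is installed; names and paths are illustrative.
Import-Module SqlServer
Invoke-ASCmd -Server "localhost\tabular" `
    -InputFile "C:\Scripts\Process Contoso Database.xmla"
```

The same cmdlet also accepts an inline script through its -Query parameter, which is convenient for short commands generated on the fly.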

Finally, it is important to note that you can generate a TMSL script by using the TOM library. For example, you can create a model dynamically by using the TOM classes. Then you can generate a TMSL script to create, delete, or alter a tabular object by using one of the methods included in the JsonScripter class. This class is described at https://msdn.microsoft.com/en-us/library/microsoft.analysisservices.tabular.jsonscripter.aspx.

Introducing the TMSL commands

A TMSL command is a JSON document that describes the request of an operation to an SSAS Tabular instance. You can use TMSL to alter objects in a database, refresh data, manage databases, or execute multiple operations to control the parallelism of the execution. For example, the TMSL command in Listing 13-5 requires a full process of the Contoso database.

Listing 13-5 Script TMSL\Process Contoso Database.xmla


{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "Contoso"
      }
    ]
  }
}


As you can see, a TMSL command does not contain references to a specific server, and is executed by the SSAS instance receiving it. You can find a description of the available commands in Chapter 7 in the section “TMSL commands.”
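A TMSL refresh can also target narrower objects, down to a single partition. In the following sketch the partition name is illustrative:

```json
{
  "refresh": {
    "type": "dataOnly",
    "objects": [
      {
        "database": "Contoso",
        "table": "Sales",
        "partition": "Sales 2016"
      }
    ]
  }
}
```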

You can create a TMSL script by using the TOM library and the JsonScripter class without connecting to a server. To do so, you must include the minimal definition of object entities in a Model object that is within a Database instance. For example, the C# code in Listing 13-6 generates a TMSL command to refresh two tables (Sales and Customer) in the Contoso database.

Listing 13-6 Models\Chapter 13\Generate TMSL Refresh\Generate TMSL Refresh.cs


using System;
using Microsoft.AnalysisServices.Tabular;

namespace Generate_TMSL_Refresh {
    class Program {
        static void Main(string[] args) {
            Database dbContoso = new Database("Contoso");
            dbContoso.Model = new Model();
            Table tableSales = new Table { Name = "Sales" };
            Table tableCustomer = new Table { Name = "Customer" };
            dbContoso.Model.Tables.Add(tableSales);
            dbContoso.Model.Tables.Add(tableCustomer);
            string tmsl = JsonScripter.ScriptRefresh(
                new Table[] { tableSales, tableCustomer },
                RefreshType.Full);
            Console.WriteLine( tmsl );
        }
    }
}


The tmsl string contains the following TMSL script:

{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "Contoso",
        "table": "Sales"
      },
      {
        "database": "Contoso",
        "table": "Customer"
      }
    ]
  }
}

The previous example illustrates that you do not need a connection to SSAS to generate a TMSL script. However, you can obtain the same result by connecting to an existing database using TOM, and then using the model entities that are populated when you connect to the database. If you are already using TOM, you can apply changes and send commands by using the native TOM functions, which is more efficient and provides more control. You should generate TMSL when you do not have direct access to the SSAS instance to execute the command (for example, scheduling the execution by using a SQL Server Agent job).

Creating a database programmatically

Using TOM, you can create a complete database programmatically. You only need to reference the Core and TOM assemblies. Then you can apply the using Microsoft.AnalysisServices.Tabular directive in your source code so that only the classes for tabular models will be used. In the following example, you will create a database with just two tables from Contoso (Sales and Customer), with a relationship between the two tables and a measure to compute Sales Amount.

You can use the default values of a new instance of the Model class, but you must define at least one data source to retrieve data for your tables. Once completed, add an instance of the Database class to a Server object that is connected to an SSAS Tabular instance. The sample code in Listing 13-7 creates the model and a data source that connects to a ContosoDW database hosted on Microsoft SQL Server.

Listing 13-7 Models\Chapter 13\Create Tabular Database\Create Tabular Database.cs


// Create the model and data source
Model smallModel = new Model();
smallModel.DataSources.Add(
    new ProviderDataSource() {
        Name = "ContosoDW",
        ConnectionString = @"Provider=SQLNCLI11;Data Source=localhost;" +
            @"Initial Catalog=ContosoDW;Integrated Security=SSPI;" +
            @"Persist Security Info=false",
        ImpersonationMode = ImpersonationMode.ImpersonateServiceAccount
    });


While a Model should include a list of tables, it is useful to create the table columns first, storing their references in specific variables. This makes it easier to reference the same columns in tables and relationships. (See Listing 13-8.)

Listing 13-8 Models\Chapter 13\Create Tabular Database\Create Tabular Database.cs


// Create columns for all the tables
Column customerKey = new DataColumn {
    Name = "CustomerKey", SourceColumn = "CustomerKey", DataType = DataType.Int64 };
Column customerName = new DataColumn {
    Name = "Name", SourceColumn = "Name", DataType = DataType.String };
Column salesCustomerKey = new DataColumn {
    Name = "CustomerKey", SourceColumn = "CustomerKey", DataType = DataType.Int64 };
Column salesDate = new DataColumn {
    Name = "Order Date", SourceColumn = "Order Date", DataType = DataType.DateTime };
Column salesQuantity = new DataColumn {
    Name = "Quantity", SourceColumn = "Quantity", DataType = DataType.Int64 };
Column salesUnitPrice = new DataColumn {
    Name = "Unit Price", SourceColumn = "Unit Price", DataType = DataType.Decimal };


Even though it is not necessary, you can create tables separately to make it easy to reference them. Every table must have a name, one (or more) columns, and at least one partition. In Listing 13-9, the partitions for both the Customer and Sales tables use the ContosoDW data source previously created in the Model object.

Listing 13-9 Models\Chapter 13\Create Tabular Database\Create Tabular Database.cs


// Create tables
Table tableCustomer = new Table {
    Name = "Customer",
    Columns = { customerKey, customerName },
    Partitions = {
        new Partition {
            Name = "Customer",
            Source = new QueryPartitionSource() {
                DataSource = smallModel.DataSources["ContosoDW"],
                Query = @"SELECT [CustomerKey], [Name] FROM [Analytics].[Customer]"
            }
        }
    }
};
Table tableSales = new Table {
    Name = "Sales",
    Columns = { salesDate, salesCustomerKey, salesQuantity, salesUnitPrice },
    Measures = {
        new Measure {
            Name = "Sales Amount",
            Expression = "SUMX ( Sales, Sales[Quantity] * Sales[Unit Price] )",
            FormatString = "#,0.00"
        }
    },
    Partitions = {
        new Partition {
            Name = "Sales",
            Source = new QueryPartitionSource() {
                DataSource = smallModel.DataSources["ContosoDW"],
                Query = @"SELECT TOP (1000) [CustomerKey], [Order Date], "
                        + @"[Quantity], [Unit Price] FROM [Analytics].[Sales]"
            }
        }
    }
};


You can add the Customer and Sales tables and their relationship to the model using the code in Listing 13-10. Note that you only need the two columns to create the relationship because the underlying tables are inferred from the columns.

Listing 13-10 Models\Chapter 13\Create Tabular Database\Create Tabular Database.cs


// Add tables and relationships to the data model
smallModel.Tables.Add(tableCustomer);
smallModel.Tables.Add(tableSales);
smallModel.Relationships.Add(
     new SingleColumnRelationship {
         FromColumn = salesCustomerKey,
         FromCardinality = RelationshipEndCardinality.Many,
         ToColumn = customerKey,
         ToCardinality = RelationshipEndCardinality.One
     });


Finally, as shown in Listing 13-11, you create the Database object and assign the Model object you populated with tables and relationships to its Model property.

Listing 13-11 Models\Chapter 13\Create Tabular Database\Create Tabular Database.cs


// Create database
Database smallContoso = new Database("Contoso Small");
smallContoso.Model = smallModel;


The call to the Update method shown in Listing 13-12 is required to transfer the changes to the SSAS instance. If you do not call the method, the changes will remain local to the TOM library.

Listing 13-12 Models\Chapter 13\Create Tabular Database\Create Tabular Database.cs


// Remove and create a new database using TOM
// Connect to an existing server
Server server = new Server();
server.Connect(@"localhost\tab16");
// Add the new databases to the database collection
server.Databases.Add(smallContoso);

// Update the complete database metadata
smallContoso.Update(
    Microsoft.AnalysisServices.UpdateOptions.ExpandFull,
    Microsoft.AnalysisServices.UpdateMode.CreateOrReplace);


You can also refresh the data in the new database by using the RequestRefresh and SaveChanges methods of the Model object. RequestRefresh only prepares the request in the TOM library; the refresh operation starts on the server when SaveChanges is invoked, as shown in Listing 13-13.

Listing 13-13 Models\Chapter 13\Create Tabular Database\Create Tabular Database.cs


// Refresh the data in a new database
smallContoso.Model.RequestRefresh(RefreshType.Full);
smallContoso.Model.SaveChanges();


As an alternative to the Update operation, you can generate the TMSL script by using the code sample shown in Listing 13-14.

Listing 13-14 Models\Chapter 13\Create Tabular Database\Create Tabular Database.cs


// Generate and display the TMSL script
// Note: We do not execute this TMSL. We use the TOM API instead, which
//       internally communicates by using a different protocol
string tmsl = JsonScripter.ScriptCreateOrReplace(smallContoso);
Console.WriteLine(tmsl);


This option does not require connecting to a server through the Server class. The generated script can be executed or scheduled by using one of the techniques described in Chapter 7 in the section "TMSL commands."

The tabular model created in this example is very simple, and it uses the minimal set of object properties. Depending on your requirements, your code will populate a larger number of properties for each entity.

Automating data refresh and partitioning

Data refresh and partition management are two operations that are typically automated by using one or more different tools. For a tabular model, the most common techniques are TMSL scripts and TOM libraries, called by PowerShell or managed languages such as C#. A complete list of tools and techniques is described in Chapter 11, “Processing and partitioning tabular models,” in the “Processing automation” section. Chapter 11 also includes a “Sample processing scripts” section, which includes TMSL and PowerShell examples to process databases, tables, and partitions.

When automating partition management, you must use TOM through a PowerShell script or a C# program. In the section “Sample processing scripts” in Chapter 11, you will find a complete PowerShell script to maintain a fixed number of monthly partitions in a table by removing older partitions and creating new ones automatically, based on the execution date. If you want a more complex and configurable general-purpose tool to manage partitions, consider the AsPartitionProcessing tool, available as an open source project from the Analysis Service development team at https://github.com/Microsoft/Analysis-Services/tree/master/AsPartitionProcessing. Its associated whitepaper, “Automated Partition Management for Analysis Services Tabular Models,” includes more details and best practices about partition management.
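As a minimal sketch of the rolling-partition pattern (not the full script from Chapter 11), the following PowerShell removes an obsolete monthly partition and adds a new one through TOM. The database, table, and partition names and the source query are illustrative:

```powershell
# Rolling monthly partitions with TOM: drop the oldest, add the newest.
# All names and the source query are illustrative.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices.Tabular")
$server = New-Object Microsoft.AnalysisServices.Tabular.Server
$server.Connect("localhost\tabular")
$model = $server.Databases["Contoso"].Model
$table = $model.Tables["Sales"]

# Remove the partition that has aged out of the retention window
$obsolete = $table.Partitions["Sales 2015-01"]
if ( $obsolete ) { $table.Partitions.Remove($obsolete) }

# Create the partition for the new month
$source = New-Object Microsoft.AnalysisServices.Tabular.QueryPartitionSource
$source.DataSource = $model.DataSources["ContosoDW"]
$source.Query = "SELECT * FROM [Analytics].[Sales] " +
    "WHERE [Order Date] >= '20170101' AND [Order Date] < '20170201'"
$newPartition = New-Object Microsoft.AnalysisServices.Tabular.Partition
$newPartition.Name = "Sales 2017-01"
$newPartition.Source = $source
$table.Partitions.Add($newPartition)

# Apply the metadata changes to the server
$model.SaveChanges()
```

In a production script, the partition names and date boundaries would be computed from the execution date rather than hard-coded, as the Chapter 11 example demonstrates.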

Analyzing metadata

You can iterate databases and entities in each model’s database to extract certain information about the tabular model, such as tables, measures, calculated columns, and partitions. You can use this information to customize the user interface of a reporting tool or to automate the manipulation of existing tabular models (as described in the next section). For example, the code sample in Listing 13-15 displays the list of databases on a particular SSAS instance.

Listing 13-15 Models\Chapter 13\Display Tabular Metadata\Display Tabular Metadata.cs


using System;
using Microsoft.AnalysisServices.Tabular;

namespace Display_Tabular_Metadata {
    class Program {
        static void Main(string[] args) {
            Server server = new Server();
            server.Connect(@"localhost\tabular");

            ListDatabases(server);
        }
        private static void ListDatabases(Server server) {
            // List the databases on a server
            Console.WriteLine("Database (compatibility) - last process");
            foreach (Database db in server.Databases) {
                Console.WriteLine(
                    "{0} ({1}) - Process:{2}",
                    db.Name, db.CompatibilityLevel, db.LastProcessed.ToString());
            }
            Console.WriteLine();
        }
    }
}


The following output shows the compatibility level and the last-processed date and time for each database on the server. (Your output will reflect the databases on your server.)


Database (compatibility) - last process
Contoso VertiPaq (1200) - Process:10/16/2016 7:24:05 PM
Contoso DirectQuery (1200) - Process:12/6/2016 3:48:38 PM
Contoso (1200) - Process:12/26/2016 10:02:58 AM
Partitions (1200) - Process:12/10/2016 4:38:41 PM
Contoso Small (1200) - Process:12/27/2016 1:19:57 PM


By navigating the Model object properties, you can retrieve tables, columns, relationships, and so on. The code sample in Listing 13-16 displays the DAX formulas used in the measures and calculated columns of the tables in the tabular model.

Listing 13-16 Models\Chapter 13\Display Tabular Metadata\Display Tabular Metadata.cs


using System;
using Microsoft.AnalysisServices.Tabular;

namespace Display_Tabular_Metadata {
    class Program {
        static void Main(string[] args) {
            Server server = new Server();
            server.Connect(@"localhost\tabular");

            foreach (Database db1 in server.Databases) {
                if (db1.CompatibilityLevel >= 1200) {
                    ListMeasures(db1);
                    ListCalculatedColumns(db1);
                }
            }
        }
        private static void ListCalculatedColumns(Database db) {
            Console.WriteLine(
                "List of calculated columns in database {0} of server {1}",
                db.Name, db.Server.Name);
            foreach (Table t in db.Model.Tables) {
                foreach (Column c in t.Columns) {
                    CalculatedColumn cc = c as CalculatedColumn;
                    if (cc != null) {
                        Console.WriteLine(
                            "'{0}'[{1}] = {2} {3}",
                            t.Name, cc.Name, cc.Expression,
                            cc.IsHidden ? "[Hidden]" : "");
                    }
                }
            }
            Console.WriteLine();
        }
        private static void ListMeasures(Database db) {
            Console.WriteLine(
                "List of measures in database {0} of server {1}",
                db.Name, db.Server.Name);
            foreach (Table t in db.Model.Tables) {
                foreach (Measure m in t.Measures) {
                    Console.WriteLine(
                        "'{0}'[{1}] = {2} {3}",
                        t.Name, m.Name, m.Expression,
                        m.IsHidden ? "[Hidden]" : "");
                }
            }
            Console.WriteLine();
        }
    }
}


The following output for this code sample shows only the rows related to the Contoso DirectQuery model:

List of measures in database Contoso DirectQuery of server DEVELOPMENT\TABULAR
'Sales'[Sales Amount] = SUM ( Sales[Line Amount] )
'Sales'[Cost] = SUM ( Sales[Line Cost] )
'Sales'[Margin] = SUM ( Sales[Line Margin] )
'Sales'[Sales Amount Original] = SUMX (
        Sales,
        Sales[Net Price] * Sales[Quantity]
   )
'Sales'[Cost Original] = SUMX ( Sales, Sales[Unit Cost] * Sales[Quantity] )
'Sales'[Margin Original] = [Sales Amount] - [Cost]
'Sales'[Calc Amount] = SUM ( Sales[Calc Line Amount] )

List of calculated columns in database Contoso DirectQuery of server DEVELOPMENT\TABULAR
'Sales'[Calc Line Amount] = Sales[Net Price] * Sales[Quantity]

After you retrieve the Model object from a Database, it is relatively easy to navigate the collections of the TOM classes, retrieving all the details about the entities you want to analyze.
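For instance, the following sketch enumerates the relationships of a model by navigating the Model.Relationships collection. It assumes the same localhost\tabular instance and Contoso database used in the previous listings; in a 1200-level model, every relationship is a single-column relationship.

```csharp
using System;
using Microsoft.AnalysisServices.Tabular;

namespace ListRelationships {
    class Program {
        static void Main(string[] args) {
            Server server = new Server();
            server.Connect(@"localhost\tabular");

            Database db = server.Databases["Contoso"];
            foreach (Relationship r in db.Model.Relationships) {
                // All relationships in a 1200 compatibility level model
                // are single-column relationships
                SingleColumnRelationship scr = r as SingleColumnRelationship;
                if (scr != null) {
                    Console.WriteLine(
                        "'{0}'[{1}] -> '{2}'[{3}] {4}",
                        scr.FromTable.Name, scr.FromColumn.Name,
                        scr.ToTable.Name, scr.ToColumn.Name,
                        scr.IsActive ? "" : "[Inactive]");
                }
            }
        }
    }
}
```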

Manipulating a data model

Using TOM, you can apply changes to an existing data model. For example, you might want to dynamically change measures and columns without having to redeploy the entire data model. However, it is important to remember that the user needs administrative rights in the tabular database to be able to alter its data model.

For example, the C# code sample in Listing 13-17 adds a measure to a model running on Analysis Services. (You can find similar examples in PowerShell at https://www.sqlbi.com/articles/adding-a-measure-to-a-tabular-model/.)

Listing 13-17 Models\Chapter 13\Add Measure\Add Measure.cs


using System;
using Microsoft.AnalysisServices.Tabular;

namespace AddMeasure {
    class Program {
        static void Main(string[] args) {
            string serverName = @"localhost\tabular";
            string databaseName = "Contoso";
            string tableName = "Sales";
            string measureName = "Total Sales";
            string measureExpression =
                "SUMX ( Sales, Sales[Quantity] * Sales[Net Price] )";

            string serverConnectionString =
                string.Format("Provider=MSOLAP;Data Source={0}", serverName);

            Server server = new Server();
            server.Connect(serverConnectionString);

            Database db = server.Databases[databaseName];
            Model model = db.Model;
            Table table = model.Tables[tableName];
            Console.WriteLine("Adding measure");
            table.Measures.Add(
                new Measure { Name = measureName, Expression = measureExpression }
            );
            model.SaveChanges();
        }
    }
}


Similarly, you can add a calculated column to a table. In this case, the column must also be calculated before it becomes available for queries. By invoking the RequestRefresh method before SaveChanges, you ensure that the two operations (adding the column and refreshing it) are executed within the same transaction. The C# code in Listing 13-18 adds a Rating column to the Product table.

Listing 13-18 Models\Chapter 13\Add Calculated Column\Add Calculated Column.cs


using System;
using Microsoft.AnalysisServices.Tabular;

namespace AddCalculatedColumn {
    class Program {
        static void Main(string[] args) {
            string serverName = @"localhost\tabular";
            string databaseName = "Contoso";
            string tableName = "Product";
            string columnName = "Rating";
            string measureExpression =
                  "VAR CustomerRevenues = CALCULATE ( [Sales Amount] ) "
                + "RETURN SWITCH ( TRUE(), "
                + "           CustomerRevenues >= 10000, \"A\", "
                + "           CustomerRevenues >= 1000, \"B\", "
                + "           \"C\" "
                + "       )";

            Server server = new Server();
            server.Connect(serverName);
            Database db = server.Databases[databaseName];
            Table productTable = db.Model.Tables[tableName];

            Console.WriteLine("Creating rating column");
            productTable.Columns.Add(
                new CalculatedColumn {
                    Name = columnName,
                    Expression = measureExpression
                }
            );

            // Include an automatic refresh of the table
            productTable.RequestRefresh(RefreshType.Calculate);
            db.Model.SaveChanges();
        }
    }
}



Note

If you want to apply the changes to the SSAS instance in separate transactions, you need to call the SaveChanges method each time you want to commit to the database.
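The following fragment is a sketch of this pattern; the measure names and expressions are illustrative, and db is a Database object obtained as in Listing 13-17. Each SaveChanges call commits the changes pending at that point in its own transaction:

```csharp
Table sales = db.Model.Tables["Sales"];

// First transaction: add the measure and commit immediately
sales.Measures.Add(
    new Measure {
        Name = "Total Quantity",
        Expression = "SUM ( Sales[Quantity] )"
    }
);
db.Model.SaveChanges();

// Second transaction: format the new measure and commit separately
sales.Measures["Total Quantity"].FormatString = "#,0";
db.Model.SaveChanges();
```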


Automating project deployment

In this section, you will see how to use the TOM library to copy a database between different servers and deploy a model.bim file to the desired server and database name. These examples will show the flexibility and control of the TOM library compared to the TMSL script and deployment wizard covered in Chapter 11.

Copying the same database on different servers

The script in Listing 13-19 copies the definition of the Contoso database from Server1 to Server2, in the corresponding SSAS Tabular instances. This technique copies only the metadata, and requires a full refresh of the destination database to make the data available for query.

Listing 13-19 Script PowerShell\Copy Database.ps1


[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices.Tabular")
$sourceServer = New-Object Microsoft.AnalysisServices.Tabular.Server
$destServer = New-Object Microsoft.AnalysisServices.Tabular.Server
$sourceServer.Connect("SERVER1\tabular")
$destServer.Connect("SERVER2\tabular")
$sourceDb = $sourceServer.Databases["Contoso"]
$destDb = $sourceDb.Clone()
$destServer.Databases.Add( $destDb )
$destDb.Update( "ExpandFull" )


If you want to copy the database on the same server, you need to change its name and ID. Usually, both properties have the same value. However, changing just the Name property is not enough because the ID property is not renamed automatically. Thus, if you want to rename a database before copying it, you must assign both the Name and ID properties before the Databases.Add method, as shown in the following code:

$destDb.Name = "Contoso2"
$destDb.ID = "Contoso2"
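The same rename-and-copy technique can be sketched in C#; the server and database names are illustrative:

```csharp
using Microsoft.AnalysisServices.Tabular;

namespace CopyDatabase {
    class Program {
        static void Main(string[] args) {
            Server server = new Server();
            server.Connect(@"localhost\tabular");

            Database sourceDb = server.Databases["Contoso"];
            Database destDb = (Database)sourceDb.Clone();

            // Both properties must be assigned before Databases.Add
            destDb.Name = "Contoso2";
            destDb.ID = "Contoso2";

            server.Databases.Add(destDb);
            destDb.Update(Microsoft.AnalysisServices.UpdateOptions.ExpandFull);
        }
    }
}
```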

Deploying a model.bim file by choosing a database and server name

In Chapter 11, you learned how to automate deployment to a production server using standard tools. TOM gives you an additional way to automate the deployment. Using the JsonSerializer class, you can load a model.bim file into a Database object and then deploy it with the same methods used to copy a database between two servers. As you can see from the C# code sample in Listing 13-20, loading a database into memory from a model.bim file requires only two lines of code, and the same is true for saving a Database object back to a model.bim file.

Listing 13-20 Models\Chapter 13\Deploy BIM File\Deploy BIM File.cs


          static Database ReadDatabaseFromBimFile(string path) {
              string modelBim = File.ReadAllText(path);
              Database database = JsonSerializer.DeserializeDatabase(modelBim);
              return database;
          }

          static void WriteDatabaseToBimFile(Database database, string path) {
              string modelBim = JsonSerializer.SerializeDatabase(database);
              File.WriteAllText(path, modelBim);
          }


Using this technique, you can read a database from a model.bim file and deploy it to a specific server. You can also change the database name by overriding the ID and Name database properties, as shown in the C# example in Listing 13-21.

Listing 13-21 Models\Chapter 13\Deploy BIM File\Deploy BIM File.cs


using System;
using System.IO;
using Microsoft.AnalysisServices.Tabular;

namespace DeployBimFile {
    class Program {
        static void Main(string[] args) {
            string serverName = @"localhost\tabular";
            string databaseName = "Contoso from BIM file";
            string bimFilename = @"c:\temp\model.bim";

            Console.WriteLine("Reading {0} file", bimFilename);
            Database database = ReadDatabaseFromBimFile(bimFilename);

            Console.WriteLine(
                "Renaming database from {0} to {1}", database.Name, databaseName);
            database.Name = databaseName;
            database.ID = databaseName;

            Console.WriteLine("Connecting to {0}", serverName);
            Server server = new Server();
            server.Connect(serverName);

            Console.WriteLine("Deploying database {0}", database.Name);
            server.Databases.Add(database);
            database.Update(Microsoft.AnalysisServices.UpdateOptions.ExpandFull);
        }
        // Insert ReadDatabaseFromBimFile and WriteDatabaseToBimFile definitions here
    }
}


You can use the same methods in the PowerShell script shown in Listing 13-22.

Listing 13-22 Script PowerShell\Deploy BIM to Database.ps1


[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices.Tabular")
$serverName = "localhost\tabular"
$dbName = "Contoso from BIM file"
$bimFilename = "c:\temp\model.bim"

$modelBim = [IO.File]::ReadAllText($bimFilename)
$db = [Microsoft.AnalysisServices.Tabular.JsonSerializer]::DeserializeDatabase($modelBim)
$db.ID = $dbName
$db.Name = $dbName

$server = New-Object Microsoft.AnalysisServices.Tabular.Server
$server.Connect($serverName)

$server.Databases.Add( $db )
$db.Update( "ExpandFull" )


After you load the model.bim file into memory by using the DeserializeDatabase method, you have access to the Model object and can alter any property, such as data source connections, security roles, partitions, or any other entity in the model.
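For example, the following sketch (the server name, catalog, and connection string are illustrative) repoints every provider data source of a deserialized database to a different SQL Server before deployment:

```csharp
using System.IO;
using Microsoft.AnalysisServices.Tabular;

namespace RepointDataSources {
    class Program {
        static void Main(string[] args) {
            Database database = JsonSerializer.DeserializeDatabase(
                File.ReadAllText(@"c:\temp\model.bim"));

            foreach (DataSource ds in database.Model.DataSources) {
                // Provider data sources expose the connection string directly
                ProviderDataSource pds = ds as ProviderDataSource;
                if (pds != null) {
                    pds.ConnectionString =
                        "Provider=SQLNCLI11;Data Source=PRODSQL;"
                        + "Initial Catalog=ContosoDW;Integrated Security=SSPI";
                }
            }
            // The database can then be deployed as shown in Listing 13-21
        }
    }
}
```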

Summary

This chapter discussed using the AMO and TOM libraries to administer and manipulate a tabular model, and the differences in the libraries required to manage different tabular compatibility levels. The TOM library gives you full control over the deployment and customization of a tabular model, which you can also manipulate offline by using the Model object. Using the examples shown in this chapter, you should be able to customize an existing tabular model or create a new model from scratch.
