
Archive for June, 2014

SCSM 2012 Dashboard by Signature Consultancy – Part 2: Requirements and Installation


In our previous post, we introduced the SCSM 2012 Dashboards by Signature Consultancy. In this post, we will walk through the Requirements and the Installation procedure.

Requirements

Following the Deployment and Configuration Guide provided by the Vendor, these Dashboards have the following requirements.

  1. System Center 2012 Service Manager (SCSM) Data Warehouse feature installed/configured
  2. System Center 2012 Service Manager (SCSM) Data Warehouse Cubes that have completed processing and are ready for use (one way to check this is sketched just after this list)
  3. Microsoft SharePoint 2010 (Foundation, Standard, or Enterprise edition)
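For the second requirement, a quick way to confirm that the Data Warehouse cubes have finished processing is to query the Analysis Services instance with AMO from PowerShell. This is only a minimal sketch based on my lab: the server\instance (SCSM-DW\SCSMDW) and the database name (DWASDataBase) are assumptions you will need to adjust, and it assumes the AMO assembly is available on the machine you run it from.

  # Sketch: list the SCSM DW cubes and their processing state via AMO
  [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")

  $ssas = New-Object Microsoft.AnalysisServices.Server
  $ssas.Connect("SCSM-DW\SCSMDW")                        # your SSAS server\instance

  $ssas.Databases.GetByName("DWASDataBase").Cubes |
      Select-Object Name, State, LastProcessed           # State should read 'Processed'

  $ssas.Disconnect()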

Installation Procedure

Data Warehouse Side - Import Files

Log into the System Center Data Warehouse server, launch a browser, navigate to http://<SSRSServerName>/Reports, and log in with an account that is a Report Administrator. In my lab example, because I have a Named SQL Instance, my URL is http://scsm-dw/Reports_SCSMDW.

Report Manager

Under SQL Server Reporting Services, click New Folder, use “SCSM Dashboard” for the name, then click OK.

New Folder

New Folder - SCSM Dashboard

Click on the newly created folder.

Newly Created Folder

Select Upload File from the Main Menu.

Upload File

Select Browse, and locate the Signature SCSM Dashboard folder where you downloaded the report files.

Browse

Select and Upload each .RDL report file.

Upload Report Files
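If you would rather script the upload than add each file by hand, the SSRS ReportService2010 web service can do this in one pass. This is only a sketch based on my lab: the web service URL (ReportServer_SCSMDW for my named instance) and the download folder path are assumptions you will need to adjust.

  # Sketch: bulk-upload the .RDL files into the SCSM Dashboard folder
  $rs = New-WebServiceProxy -Uri "http://scsm-dw/ReportServer_SCSMDW/ReportService2010.asmx" -UseDefaultCredential

  $folder = "/SCSM Dashboard"
  Get-ChildItem "C:\Downloads\Signature SCSM Dashboard" -Filter *.rdl | ForEach-Object {
      $bytes    = [System.IO.File]::ReadAllBytes($_.FullName)
      $warnings = $null
      # "Report" is the catalog item type used for an .RDL definition
      $rs.CreateCatalogItem("Report", $_.BaseName, $folder, $true, $bytes, $null, [ref]$warnings) | Out-Null
  }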

Now that the report files are uploaded, we need to create a New Data Source for the reports to use. Select ‘New Data Source‘ from the menu.

New Data Source

On the New Data Source screen, enter the following information.

  • Name: “SCSMDashboardDataSource”
  • Data Source Type: “Microsoft SQL Server Analysis Services”
  • Connection String: “DataSource=<SSASServerName>;Initial Catalog=DWASDataBase”
  • Connect Using: “Windows Integrated Security”

NOTE: In my lab example, I am using a Named SQL Instance. Therefore, my Connection String is as follows: “DataSource=SCSM-DW\SCSMDW;Initial Catalog=DWASDataBase”

New Data Source - Details

Once you have entered all the appropriate information and selections, click the ‘Test Connection’ button. Ensure that the connection results show “Connection created successfully”, then click OK to save the settings.

Test Connection
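As an alternative to clicking through Report Manager, the same shared data source can be created with the ReportService2010 web service. Again, this is only a sketch under my lab’s assumptions (the named-instance URL, and the same connection string shown in the note above); “OLEDB-MD” is the SSRS extension name for Analysis Services data sources.

  # Sketch: create the shared SSAS data source used by the dashboard reports
  $rs = New-WebServiceProxy -Uri "http://scsm-dw/ReportServer_SCSMDW/ReportService2010.asmx" -UseDefaultCredential
  $ns = $rs.GetType().Namespace

  $def = New-Object ("$ns.DataSourceDefinition")
  $def.Extension           = "OLEDB-MD"        # Analysis Services data extension
  $def.ConnectString       = "DataSource=SCSM-DW\SCSMDW;Initial Catalog=DWASDataBase"
  $def.CredentialRetrieval = "Integrated"      # Windows Integrated Security

  $rs.CreateDataSource("SCSMDashboardDataSource", "/SCSM Dashboard", $true, $def, $null) | Out-Null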

From the browser, in the SCSM Dashboard folder, perform the following for each .RDL report file.

Select Options (the down arrow) and choose Manage.

Report - Manage

In the menu on the left, choose Data Sources, and click the Browse button.

Data Sources - Browse

In the folder explorer, browse to the Data Source that we previously created, select it, then press OK.

Browse - Data Source - SCSMDashboardDataSource

Back on the Data Sources screen, ensure that the Shared Data Source field is populated, then click Apply.

Data Sources - Apply

Navigate back to the SCSM Dashboard folder, then repeat the same steps for the other Report (.RDL) files.
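Because this is the same handful of clicks for every report, it is also a good candidate for scripting. The sketch below uses the ReportService2010 web service to point every report in the folder at the shared data source; as before, the URL and paths are taken from my lab and are assumptions for your environment.

  # Sketch: bind every report in the folder to the shared data source
  $rs     = New-WebServiceProxy -Uri "http://scsm-dw/ReportServer_SCSMDW/ReportService2010.asmx" -UseDefaultCredential
  $ns     = $rs.GetType().Namespace
  $folder = "/SCSM Dashboard"
  $shared = "$folder/SCSMDashboardDataSource"

  $rs.ListChildren($folder, $false) | Where-Object { $_.TypeName -eq "Report" } | ForEach-Object {
      $report  = $_
      # Re-point each data source defined in the report at the shared data source
      $updated = foreach ($ds in $rs.GetItemDataSources($report.Path)) {
          $ref = New-Object ("$ns.DataSourceReference")
          $ref.Reference = $shared
          $new = New-Object ("$ns.DataSource")
          $new.Name = $ds.Name                 # keep the data source name the RDL expects
          $new.Item = $ref
          $new
      }
      $rs.SetItemDataSources($report.Path, @($updated))
  }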

After you have applied the Data Source for all the Reports, navigate back to the SCSM Dashboard folder (that contains the .RDL files), and click on one of the reports. This will launch the report, which should populate with your data.

NOTE: In my lab example, I didn’t have any data to display.

Launch Report

In the last post of this series, we will cover Publishing the Dashboards in the SharePoint Portal.

 

As always, if this post helped you in any way, and you would like to show your appreciation, please rate it and comment on it. Also, feel free to contact me with requests for future articles.

SCSM 2012 Dashboard by Signature Consultancy – Part 1: Introduction


If you have worked with System Center Service Manager, and in particular its Reporting, you may have found yourself wanting a little more; for example, Dashboards.

Here are some resources that might be of assistance:

Aside from these references, a company (Signature Consultancy) has created and released (for free) some dashboards for Service Manager. These dashboards cover 4 areas (Activity Management, Change Management, Incident Management, Problem Management).

Here is the description of the product:

The SCSM 2012 Dashboard from Signature Consultancy provides persistent visibility to hot spots and highlights in your Service Desk Processes and performance with numerous charts, metrics and analytics. 
The SCSM 2012 Dashboard from Signature Consultancy is easy to implement and use. It is intended to give you a 360 degrees view. Furthermore, it is highly customizable and provides the flexibility to edit, add, and change the entire interface, reports and charts to meet the most demanding requirements. 

The SCSM 2012 Dashboard presents various key metrics and business analytics that Executives and Analysts can use for monitoring and better visibility into the service desk operations. 

In addition to the key metrics, the SCSM 2012 Dashboard from Signature Consultancy can also show you a timeline of open service requests by day, and days range. This option will provide you with the possibility to follow your service request volume over time identifying trends and making forecasts more accurate and much easier.

With this dashboard, you can identify bottlenecks, helping you reduce time to resolution and meet SLAs, thereby making a direct impact on customer satisfaction.

Here are some screenshots of the dashboards provided by the designer.

Activity Management Dashboard

 

Incident Management Dashboard - Top

 

Incident Management Dashboard - Bottom

 

If you are interested in trying these dashboards yourself, start by navigating to the Signature Consultancy website, under Products > SCSM Dashboard, and click Free Download.

Free Download

After you have entered your registration information, click Submit and Download.

Free Download Registration

You will then be presented with the files to download. Note that at the time of this writing, the dashboard version is 1.2.

Additionally, the Vendor has informed me that they are working on an Enterprise version (though not for free) of the dashboards, which will include the following additional features:

  1. Snapshot page: Provides reporting for current WorkItems within your SCSM server
  2. Five pages: Provides Incident, Problem, Change, Request, and Release pages that connect directly to the DWDataMart DB, with no need for Cubes
  3. Full web interface, with tablet and mobile phone support
  4. Access permissions per page

Download Dashboard Files

When you download the .ZIP file, it will contain a Deployment and Configuration Guide. This guide contains some screenshots, but not many.

I will utilize the steps provided in that guide, and include my own screenshots step-by-step. Let’s begin with Requirements and Installation.

As always, if this post helped you in any way, and you would like to show your appreciation, please rate it and comment on it. Also, feel free to contact me with requests for future articles.

My Experience With The PowerShell Deployment Toolkit (PDT) – Part 4 (PDT GUI for PowerShell Deployment Toolkit)


Recently, via Twitter, I was alerted to something new and exciting with the PowerShell Deployment Toolkit (PDT). 

Previously, I wrote a series on My Experience With The PowerShell Deployment Toolkit (PDT), which continues to get hits/views every day!

I was very excited to see a Tweet about a new PowerShell Deployment Toolkit (PDT) GUI! So, instead of using the command line process for using the toolkit, if you are more comfortable in a GUI, you now have that option!

Here is the description:

“The PDT GUI is a Graphical User Interface for the Powershell Deployment Toolkit. The original PDT is created and maintained by Rob Willis from Microsoft Corporation. The PDT GUI is created and maintained by German Microsoft Partner ‘Elanity Network Partner GmbH’ and is not an official Microsoft Product. The PDT GUI helps to create fast PDT Configuration-Files (Variable.xml) for Zero Touch System Center Deployments. PDT GUI creates and validates the configuration files for PDT, it does not alter the existing PDT in any way.”

For further information on the PDT GUI for Powershell Deployment Toolkit, see the following: http://gallery.technet.microsoft.com/PDT-GUI-for-Powershell-6908b819#content.

So, here’s a quick extension article to the ‘My Experience With The PowerShell Deployment Toolkit (PDT)’ series.

NOTE: At the time of originally writing this post, the current PDT GUI version available was v1.0. However, approximately a week ago the tool was updated, so some of my original article was no longer accurate. Therefore, to ensure that I provide something of value to the community through my posts, I have taken the time to update this post (even though I didn’t publish my original article to the public yet) accordingly.

So, with that being said, I will walk through my experience using this GUI for the toolkit. This is very useful for re-building your lab when you need to.

 

Start by downloading the PDT GUI. If you do a Google search for the “PDT GUI”, you will end up at the following (http://gallery.technet.microsoft.com/PDT-GUI-for-Powershell-6908b819#content).

PDT GUI 1.1 Download

Download the ZIP file and extract it; the extracted folder contains the following folders and files:

  • System Center 2012R2
  • System Center and WAP Complete
  • Windows Azure Pack
  • PDT-GUI.Export.ps1
  • ValidationInfo.xml

ZIP File Contents

Right-click on the PDT-GUI.Export.ps1 file, and choose Run With PowerShell.

Run With PowerShell
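One small aside: because these files were downloaded from the internet, Windows may block them and PowerShell may refuse to run the script, depending on your execution policy. If that happens, something along these lines helps (the extraction path is just an example):

  # Clear the "downloaded from the internet" flag and allow locally created scripts to run
  Set-Location "C:\PDT-GUI"
  Get-ChildItem -Recurse -File | Unblock-File
  Set-ExecutionPolicy RemoteSigned -Scope CurrentUser
  .\PDT-GUI.Export.ps1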

 

In case you have not yet downloaded the PowerShell Deployment Toolkit (PDT) itself, or you have it in a different directory, you may also encounter the following message.

No PDT Workflow Message

Press OK, and you will be presented with the following License Terms of Use screen. Click “I Agree“.

EULA Prompt

The PowerShell window will then show the following, as it downloads the PowerShell Deployment Toolkit (PDT).

PowerShell Window - Download PDT

If you look back in the PDT GUI folder, you will now see the additional files:

  • Downloader.ps1
  • Installer.ps1
  • Variable.xml
  • VariableAD.xml
  • VMCreator.ps1
  • Workflow.xml

Downloaded PDT Files

Once the process finishes downloading the PDT files, it will automatically re-launch the GUI script.

When the PDT GUI loads for the first time, you should see the following screen. Either click the “Select File” tab, or the Next button.

PDT GUI - About

On the Select File tab, you can either select an existing Variable.xml file to use, or point to your own customized version. If you are selecting an existing file, you have to click the Open button to load the file, and then click either the “General” tab or the Next button.

PDT GUI - Select File

On the General tab, you can provide the Product Keys for System Center and SQL Server, information about the installation Service Account, directory paths for the installation files, etc. As stated on this screen, the variables/fields that are required are in bold; namely: “Installer Service Account”, “Installer Service Account Password”, “Source Path”, “Registered User”, and “Registered Organization”.

When you have filled out all of the variables/fields you want/need, either click the “VMs” tab, or press the Next button.

PDT GUI - General

On the VMs tab, you will notice 3 sub-tabs: “Domain”, “Default VM Settings”, and “VMs”.

Starting with the Domain tab, you can select the checkbox “Create a new AD Forest“. If you select this, you will then be able to provide the Name, Service Account OU, and the Group OU. Don’t forget to click Save Changes once you complete these fields, then click on the “Default VM Settings” sub-tab.

NOTE: If you click the “Next” button, this will bring you to the “SQL” tab, and not the “Default VM Settings” sub-tab.

PDT GUI - VMs - Domain

On the Default VM Settings tab, you will see a list tree with all of the default settings you can configure for each VM. Note the message at the top of the screen which says “Specify the Default Settings for all VMs generated, switch to VMs tab to set individual settings per VM.”

The fields of importance in this screen are as follows:

  • Host: This is your physical server that is hosting/running all of the VMs
  • VM Folder & VHD Folder: This is the directory that will hold all the VM-related files. By default this is set to C:\VMs, so if you have a dedicated drive to host your VMs (instead of hosting them on the same volume that is running the Host OS), make the change here.
  • Network Adapter > Virtual Switch: Ensure that you already have a Virtual Switch created for your environment, and change the value here to match. By default it is set to “CorpNet01”. (A sketch for creating one follows this list.)
  • Network Adapter > IP > Prefix: If your virtual network will be using something other than 192.168.1.x, then change the value here
  • Network Adapter > IP > Gateway: Similarly to the IP Prefix, if your Gateway is not 192.168.1.254, then change it here. Note that even if you change the IP Prefix (and save that change), for example to 192.168.2.x, the corresponding Gateway and DNS IP values do not update to reflect this (i.e. they remain at their defaults of 192.168.1.1 and 192.168.1.4 respectively, instead of changing dynamically to the 192.168.2.x range).
  • Network Adapter > IP > DNS: As with the note above about the Gateway, change the value for your DNS as required.
  • OS Disk > Parent: If you used the Convert-WindowsImage.ps1 script (as described in this post: https://adinermie.wordpress.com/2014/01/26/my-experience-with-the-powershell-deployment-toolkit-pdt-part-2-vmcreator-ps1/), then you will need to change this value to point to the location of your .VHDX file.
  • Join Domain > Domain: You will need to provide your Domain Name in this field. Note that even if you selected the option (on the Domain tab) to create a new AD Forest, and supplied a Name for it, that value is not dynamically copied into this field.
  • Join Domain > Credentials > Domain: You will need to enter the Domain Name again in this field; as noted above, the field is not dynamically populated with changes you’ve made previously.
  • Join Domain > Credentials > Username & Password: Provide the Service Account credentials that will be used to join all the VMs to the domain. Note that the password is in plain text.
  • Administrator Password: Provide the password for the default Administrator account. Note that the password is in plain text.
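On the Virtual Switch item above: if you have not created one yet, a single Hyper-V cmdlet takes care of it. This is a sketch matching the PDT default switch name; the physical adapter name (“Ethernet”) is an assumption, so check Get-NetAdapter for yours.

  # Create an external virtual switch named to match the PDT default ("CorpNet01")
  New-VMSwitch -Name "CorpNet01" -NetAdapterName "Ethernet" -AllowManagementOS $true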

After you have completed all the fields you would like to modify, don’t forget to click Save Settings after each modified field. Then click on the VMs sub-tab.

NOTE: If you click the Next button, this will bring you to the “SQL” tab, and not the “VMs” sub-tab.

PDT GUI - VMs - Default VM Settings

On the VMs tab, you can modify the VM-specific settings to be different than the defaults provided. The settings that you are able to modify will vary depending on the role of the VM.

For example, if I select any of the Database VMs (labelled DB##), I can change the Memory settings, and the Data Disks.

NOTE: The DB02 VM lists the modifiable Settings in reverse order compared to all the other Database VMs (i.e. DataDisks then Memory, instead of Memory then DataDisks).

If I select the VMM VM (VMM01), I can change the Memory, but if I select the Orchestrator VM (OR01) I can only change the VM Name. Similarly, for the SCCM VM (CM01) I can additionally change the Data Disks. For the Service Manager portal VM (SM03), I can change the OS Disk > Parent property (as it requires Windows Server 2008 R2 and not Windows Server 2012), and the Network Adapter > Identifier property. Finally, for the DPM VM (DPM01), there are 2 sets of Data Disks, both sets containing 4 disks at 100 MB each (by default).

So there are a lot of settings/customizations that can be made. Make the changes required, remembering to click Save Settings after each modification, and then either click on the “SQL” tab, or click the Next button.

PDT GUI - VMs - VMs

On the SQL tab, we have all the settings for each of the SQL Database VMs. This screen shows all of the SQL installations that will occur, along with the Instance information. You will notice that all installations of SQL Server will be using the SQL Server 2012 version, and will all use the same Instance name of “MSSQLSERVER”.

If you select one of the servers from the Instances list, the Instance Variables area will update to reflect that Instance’s settings. Of interest, because I changed the Domain Name, not on the VMs > Domain tab (though I changed it there as well), but on VMs > Default VM Settings > Join Domain > Domain (I changed it to Test.com), the dropdown list containing the server name (above the Save Changes button) now shows the proper FQDN for the VM Name (i.e. DB01.Test.com, instead of the default DB01.Contoso.com). If I select it from the dropdown list and then click Save Changes, the Instance list will be updated and show this change immediately.

Unfortunately, even though the Instance information is updated in the list, this does not change the Instance Variables fields to reflect the newly referenced Domain Name. Note: I will contact the developers and provide this feedback as a suggestion for improvement.

For now, once you have modified all the SQL Instance information required, click on the Roles tab, or click the Next button.

PDT GUI - SQL

On the Roles tab, there are 3 areas that are displayed: “Components“, “Role“, and “Components and Roles that will be installed“.

The Components section lists all of the Components available to be installed. This includes all the System Center 2012 SP1 components (including the Service Provider Foundation), all the System Center 2012 R2 components (including the Service Provider Foundation, Service Management Automation, and Service Reporting), along with a few others like the Windows Azure Pack, Windows Azure Services, and both required versions of SQL Server (namely 2008 R2, and 2012).

When you select a Component, the Roles area will populate with the various roles within that component.

For example, selecting the System Center 2012 R2 Virtual Machine Manager component, you will see the 8 roles associated with that product, including the ability to make an Active-Passive cluster of the Management Server!

Also of interest, if you select a Role that has “Database” in its name, the Server list is filtered to only show/display the VMs that will have SQL installed (which corresponds to the Instances list on the SQL tab).

After selecting a Component, and a corresponding Role (of which you can only select one at a time), you will need to specify which Server to install it onto. If you have made a change to the Domain Name (presumably from the VMs > Default VM Settings > Join Domain > Domain property), the Server list correctly shows/reflects the Domain Name changes made.

When you click Add Selected Role, the tool will validate which other systems/products are already going to be installed on that specific server, and produce an error message if the Role combination is invalid.

PDT GUI - Roles

Role Combinations Are Invalid

Further, once you have added a Component/Role combination, or if you just select one of the existing ones, you will be able to modify the application-specific settings. Review the settings (which, again, do not automatically update to reflect the Domain Name changes), and don’t forget to click Save Values, then either click on the Finish tab, or click Next.

Component_Role Settings

On the Finish tab, there are 3 sections: Downloader, VMCreator, and Installer. These correspond to the PowerShell Deployment Toolkit (PDT) scripts. If you are interested, see the beginning of this series for a walk through on the XML files and each of the scripts.

You are going to run each in succession. The Start Downloader option will download all of the source files required. The Run VMCreator option will, of course, create all the VMs required for deployment. And the Run Installer will perform the installation. It is recommended to run the Validation Process first, so that the source files and VMs are validated to ensure there are no issues. You wouldn’t want to end up with a partial deployment.

PDT GUI - Finish
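For reference, these three options drive the same scripts that the GUI downloaded earlier (Downloader.ps1, VMCreator.ps1, and Installer.ps1), so you can also run them yourself from an (ideally elevated) PowerShell prompt, as covered earlier in this series. A sketch, assuming the same extraction folder used throughout this walkthrough:

  Set-Location "C:\PDT-GUI"     # the folder containing Variable.xml and the PDT scripts
  .\Downloader.ps1              # download all required source media
  .\VMCreator.ps1               # build the VMs defined in Variable.xml
  .\Installer.ps1               # install the selected components and roles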

This is definitely a useful tool, especially for setting up a custom lab quickly (as in my experience it completes within a few hours), or even for an actual Production deployment, as deploying System Center utilizing the PowerShell Deployment Toolkit (PDT) is fully supported by Microsoft, since it is just utilizing the scripting abilities of the individual products.

 

As always, if this post helped you in any way, and you would like to show your appreciation, please rate it and comment on it. Also, feel free to contact me with requests for future articles.
