Blog Page

10 Reasons to Attend #SQLSat Atlanta

sqlsafety 0 Comments

With just over a week and a half left, it is time to register for SQL Saturday Atlanta, and here are 10 reasons you should attend.

1. Rafael Salas, twitter, ETL Architecture Checkup: Avoiding Costly Mistakes This session is an encore from SQL Saturday Puerto Rico.  While at that event I told Rafael that I heard he would be presenting in Atlanta.  Rafael looked at me, scratched his head, and asked “I’m doing what?”  Max Trinidad snapped him back to reality with a good old-fashioned “I told you so!!”


After a moment Rafael regained his composure, and memory, and put on his rally cap and told me that he was fired up to present at SQL Saturday in Atlanta.  Really…  No bull.


Alright…  Maybe that conversation never happened, but he did put on the horned helmet!!  If you want the truth make it up yourself.

2. Wayne Sheffield, twitter, Crazy Things That Developers Do Are you a developer and want to know what happens in Vegas after the DBAs go to sleep?  Do you want to know how to get the hybrid on the roof after last call?

Do you want to know where to find a helmet with horns to wear at the after event of SQL Saturday?  If you answered yes to any of these questions then this isn’t the session for you.  If you want to know some common mistakes developers make and how to correct them then this is the session for you.  If you wanna find a horned helmet then follow Rafael to the after event.  You might get the helmet and you might get the horns.  Your call.

3. Patrick LeBlanc, twitter, Introduction to Power BI Have you heard the whispers of “self-service information management,” or the ability to present users with the tools necessary to evaluate and analyze corporate data without the help or intervention of IT staff?  I know!!  As far as legends that everyone talks about but few have seen, we have Bigfoot, the Yeti, the Loch Ness Monster, and now self-service business intelligence.  Join Patrick as he discusses SSIM and demonstrates it in captivity.

4. Brian Moran, twitter, Workplace Coaching: Simple Skills that Change Everything Are you looking for the skills necessary to grow, lead, and help others around you grow and lead within your organization?  OK, well how about the ability to coach without using coercion or mental and emotional tactics?  C’mon now.  After 24 years in law enforcement, “coaching” employees was sometimes synonymous with “tasing”.  Learn the right way now!

5. Melissa Coates, twitter, Administering & Managing the Power BI Environment After attending Patrick’s session on how to find and capture self service business intelligence, you better show up to hear Melissa discuss how to care for and manage it in captivity.

6. Kenneth Urena, twitter, Indexing strategies and good physical designs for performance tuning Did you attend Kenneth’s session at SQL Saturday in Costa Rica and leave saying “I’m not sure I understood everything”?  That is because it was in Spanish!!  C’Mon Man!!  Join Kenneth as he explains indexing strategies to improve database performance.

7. Kalen Delaney, twitter, What’s Simple About SIMPLE Recovery? True or False: Simple recovery means that SQL does not log database changes or transactions, that no database administration is necessary, and that the Cleveland Browns will make it to the Super Bowl this year.  If you answered true then you need this session more than air!!  Sorry, Browns fans.  I saw my shot and took it.

8. Louis Davidson, twitter, Database Design Fundamentals The best database design is done with friends over a bottle of wine using an Excel spreadsheet as your model right??  Try again!  Louis discusses proper database design techniques and normalization.

9. Robert Cain, twitter, Everything You Ever Knew About SSIS Is Null and Void Do you think you know SSIS?  WRONG AGAIN!!  Read the session title.  Join Robert and learn about the change in SSIS 2012 deployment model as well as new features and functionality.

10. Jeffrey Garbus, twitter, Bad, less Bad, not Bad; rewriting bad SQL Code The session title may not sound like a pep talk, but if you are looking for tips on rewriting T-SQL code you should show up.

If you take a look at the event schedule you can see that I could give at least another 20 great reasons to attend SQL Saturday in Atlanta, but my flight is landing and I am getting “the look” to put my laptop away.

10 Reasons to Attend #SQLSat Costa Rica

sqlsafety 0 Comments

SQL Saturday Costa Rica is less than one week away.  There is still time to register, and if you haven’t yet, here are some reasons to get you motivated.

1. Andy Leonard, twitter, Using BIML as an SSIS Design Patterns Engine If you work with SSIS and have always thought, “There has to be an easier way to quickly replicate these redundant tasks in SSIS,” then this is the session for you.  Andy literally wrote the book on SSIS Design Patterns.  I won’t tell you how it ends, but the hero wins after an awesome fight sequence and there is definitely room for a sequel.

2. Jose Chinchilla, twitter, Aprende todo sobre Business Intelligence con SQL Server 2012, SharePoint y Excel 2013 (Learn All About Business Intelligence with SQL Server 2012, SharePoint, and Excel 2013) If you are a business intelligence developer then this session is a MUST.  The BI toolset is no longer limited to SQL, but is extended to SharePoint as well as Excel.  Need more information on BI and the tools available?  Then show up.

3. Kenneth Urena, twitter, Estrategias de Indexación para alto desempeño (Indexing Strategies for High Performance) Has your indexing strategy gone stale?  Then sit in to learn about indexing strategies for high performance.

4. Carlos Chacón, twitter, Judge, Jury, and Executioner—A First Look at Execution Plans Are you query plan illiterate?  Did you miss this session at SQL Saturday Puerto Rico?  Time to catch up!!  This session is the “Hooked on Phonics” for execution plan reading.

5. Rodney Landrum, twitter, Data Analytics and the DBA – Using PowerView to Uncover Interesting Facts About Your SQL Servers Looking for user-friendly tools to uncover information about your data?  How about PowerPivot and PowerView?  This session will show you how to use these new tools to retrieve and display this information.

6. Eduardo Castro, twitter, Optimizar el desempeño de SQL Cloud Database (Optimizing SQL Cloud Database Performance) Have you thought of moving SQL to Azure VMs and are looking for some best practices?  Join Eduardo for this, as well as a comparison of running SQL as a service versus as infrastructure.

7. Jose L. Rivera, twitter, Table Partitioning A to Z What are some table partitioning strategies as well as improvements in SQL 2014?  I couldn’t tell you.  That’s why Jose will be in Costa Rica.

8. Tim Radney, twitter, Know Backups and Know Recovery Do you really need a backup and recovery strategy?  Not as long as you have a junior DBA to blame it on.  THAT WAS A JOKE!!  Tim discusses backup strategies and how these should work together with your recovery strategy.

9. Jose Gregorio Lopez, twitter, In Memory OLTP – Introduction  So you went to Jose’s presentation at SQL Saturday Puerto Rico a few weeks back and are following him to Costa Rica?  A little creepy and bordering on stalking.  Get an introduction to SQL’s new in-memory technology.

10. I attended this event last year and despite Eduardo and others getting to know me I was invited back.  That REALLY speaks volumes about patience and tolerance; you can refer to my post about my visit here to support this.  The above listed are only a few of the informative and exciting sessions that will be presented.  I am very excited and honored to be participating at SQL Saturday Costa Rica again this year and hope to see you there.

SQL Saturday Puerto Rico 2014

sqlsafety 0 Comments

WHAT AN AWESOME EVENT!!  I want to thank everyone for such an incredible experience and special thanks to Jose Rivera, Guillermo Caicedo and all the volunteers for all of your work.  Can’t wait for next year!

 

As promised my presentation materials can be downloaded from the SQL Saturday event site or here.

10 Reasons to Attend #SQLSat Puerto Rico

sqlsafety 0 Comments

We are only days away from SQL Saturday #283 in San Juan Puerto Rico!!  If you are looking for justification to attend let me help you out.

1. Bradley Ball, twitter, Inside the Query Optimizer  Want to optimize your queries and gain a better understanding of optimization rules?  Kewl!  See you there.

2. Jose Gregorio Lopez, twitter, SQL Server 2014 In Memory Column Store Indexes – Haciendo un Almacén de Datos (Making a Data Warehouse)  Have you heard about SQL Server’s columnstore indexes?  Want to learn how they work and when to use them?  Here is your chance!

3. Alan Koo, twitter, PowerPivot, Tabular and Multidimensional Models: Understanding Analysis Services Flavors.  If you haven’t heard there is a new data model in town beginning in SSAS 2012.  If you want to learn about the Business Intelligence Semantic Model (BISM) and the concepts of SSAS and PowerPivot then just show up.

4. Jorge Segarra, twitter, Azure Virtual Machines: A Walk in the Cloud. Virtualization has become prevalent in almost all organizations.  If you haven’t educated yourself about Windows Azure virtual machines then stop by.  If anyone is interested I would appreciate a session called “Virtual Machines, Virtual Desktops, and How You Can Become a Virtual Employee”.  Just wanted to throw it out there.

5. Rafael Salas, twitter, ETL Architecture Checkup: Avoiding Costly Mistakes.  An ounce of prevention is worth a pound of cure.  Make sure that you don’t make costly mistakes when you design and build your ETL infrastructure.

6. Jorge Sanchez, Creating Reporting Services Reports to be Run in SQL Server Management Studio.  SSMS has included a number of pre-built SSRS reports since SQL 2005.  These provide some great insight and useful information, but it’s time to extend this functionality with reports of your own. 

7. Wayne Sheffield, twitter, Table Vars & Temp Tables – What you NEED to Know!  What is the difference between a table variable and temporary table?  When do I use a temp table over a table variable?  What is the answer to life?!?!  Show up and find out.  ALL RIGHT!!  Maybe I made up the last question!  No pressure Wayne.

8. Jose Chinchilla, twitter, PowerPivot + Powerview + PowerMap = Self-service BI Revolution.  Translation = “Power Session!!”  Nuff said.

9. Guillermo Caicedo, twitter, Introduction to SSIS with BIML.  SSIS is an AWESOME tool, but what if you want to automate some typical tasks?  BIML, Business Intelligence Markup Language, is the answer.

10. IT IS PUERTO RICO!!  Do you really need 9 other reasons?!?  C’MON MAN!!

A SQL Cmdlet a Day 4 SQLPS

sqlsafety 0 Comments

Over the past few days I have posted about working with SQL using PowerShell: introduction, piping, and variables.  Each of these posts demonstrated working with a PowerShell session that was launched from SQL Server Management Studio rather than from a Windows PowerShell prompt.  If you have attempted any of the code snippets from those posts within Windows PowerShell, you most likely received an error.  The reason is that launching PowerShell from SSMS opens a prompt where the SQL Server PowerShell snap-ins have already been loaded and registered; Windows PowerShell does not automatically load the SQL modules needed to work with SQL Server.

To demonstrate this, open a PowerShell prompt from SSMS and let’s take stock of the assemblies that are automatically loaded:

[appdomain]::CurrentDomain.GetAssemblies() | SELECT FullName


You will notice that there are a number of assemblies that are returned, but which ones are related to SQL?  By piping the results to a Where cmdlet and then to our Select we are able to filter those modules that have “sql” within the FullName:

[appdomain]::CurrentDomain.GetAssemblies() | where {$_.FullName -like "*sqlserver*"} | Select FullName

Kewl, but what’s the difference between this and Windows PowerShell?  Running the same cmdlet in a Windows PowerShell prompt shows that there are no assemblies that match our filter of having sql in the FullName:


While working within Windows PowerShell consider that you want to create a variable that is the localhost default instance of SQL.  Running the below cmdlet in Windows PowerShell results in the following error:

$server = New-Object ('Microsoft.SqlServer.Management.Smo.Server') -ArgumentList "localhost"


The reason for the error is that the SQL assemblies have not been loaded.  This does not mean that you can never use a Windows PowerShell prompt to work with SQL, but rather that you must first import the SQL modules.  There are several ways of doing this, but the easiest is to call the sqlps utility from within your Windows PowerShell session:

sqlps

Running this simple command starts a Windows PowerShell 2.0 session with the SQL Server PowerShell provider and cmdlets loaded and registered, which means I can now work with this prompt the same as if I had started it from SSMS.  Something that may also jump out is the path of the prompt has changed.  Rather than being in the current user directory, C:\Users\David>, the prompt is now in the context of PS SQLSERVER:\>. 
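Once inside that session, the SQL Server provider can be navigated like a file system.  A quick sketch (the instance name “localhost” and the “DEFAULT” instance below are just examples; substitute your own):

```powershell
# Launch the sqlps utility from a Windows PowerShell prompt
sqlps

# The prompt changes to the SQL Server provider context: PS SQLSERVER:\>
# Navigate to the databases of the default instance on this machine
cd SQL\localhost\DEFAULT\Databases

# Familiar file-system style commands now work against SQL objects
dir
```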

As I said, this is the easiest means, but the sqlps utility has also been marked as deprecated, which means that we should use the Import-Module cmdlet instead.  Several additional steps MUST be taken to successfully import this module:

1. The Windows PowerShell prompt must be run as Administrator

2. The execution policy must allow running scripts signed by a trusted publisher:

Set-ExecutionPolicy RemoteSigned

Once that has been done you can call your Import-Module cmdlet:

Import-Module "sqlps" -DisableNameChecking


A couple more hoops to jump through, but the end result is the same.  We can now verify the assemblies by again querying the loaded modules:

[appdomain]::CurrentDomain.GetAssemblies() | where {$_.FullName -like "*sqlserver*"} | Select FullName


Now running my cmdlet assigning the localhost default instance to my variable succeeds:

$server = New-Object ('Microsoft.SqlServer.Management.Smo.Server') -ArgumentList "localhost"

$server.Databases


A SQL Cmdlet a Day 3 Variables

sqlsafety 0 Comments

As with any scripting language, PowerShell supports variables, but unlike many languages, declaration and assignment are streamlined.  One of the coolest things about PowerShell variables is that the data type is assigned automatically based on the value.  Variable declaration is done by simply prefixing a “$” to the variable name.  The below statement creates a variable called “$var” and does an inline assignment of the string value “Hello World!!”.

$var = "Hello World!!"

Calling the variable within your command shell displays its value.

As I mentioned, the data type is based on the assigned value, so in this case we don’t need to formally declare one.  The data type can be retrieved by calling the GetType() method of the variable:

$var.GetType()

To get a list of all members available we can pipe the $var variable to a Get-Member cmdlet:

$var | Get-Member

To get more information on “piping” you can look at the second blog in this series here.

Since the data type of this variable is a string we cannot perform operations on it that are not supported on this type, such as incrementing the variable by one:

$var++


But we can easily change the variable type by assigning a numeric value.

$var = 1

$var++

$var.GetType()


Again, these examples and concepts are pretty straightforward, so let’s dial it up and take a look at working with arrays.  The only difference in creating an array is the syntax of the value assignment.  The below cmdlet creates a variable called $Computers and assigns an array of values.  The type is displayed by calling the GetType() method, and then the values are displayed using the zero-based index.

$Computers = @("Server1", "Server2", "Server3")

$Computers

$Computers.GetType()

$Computers[0]

$Computers[1]

$Computers[2]

$Computers[2] = "Server4"

$Computers[2]

 


Pretty kewl, right?  But what does this have to do with using PowerShell variables with SQL?  Glad you asked!!  We’ll start with a simple example of creating a variable that will be assigned the localhost default instance.  The first thing I’ll point out is that rather than a direct assignment I am using the New-Object cmdlet and specifying the type:

$server = New-Object ('Microsoft.SqlServer.Management.Smo.Server') -ArgumentList "localhost"

With the assignment complete we can now call the properties or methods available:

$server.Databases


This effectively demonstrates how PowerShell can be used to easily create variables of SQL objects and return information about them, but we want to be able to programmatically administer SQL, right?  Consider the common and simple scenario of changing the recovery model of a database.  Obviously this can be done using SSMS, but what about scripting it out in PowerShell?  SIMPLE!!  The previous cmdlet demonstrated creating a variable to hold the localhost default instance and query its databases, but this same variable can also be used to change the recovery model of one or all of the associated databases.  The below cmdlet calls the TwitterSearch database and pipes the output to a Select cmdlet to show the name and RecoveryModel of the database.

$server.Databases["TwitterSearch"] | Select Name, RecoveryModel


Changing the RecoveryModel is as simple as assigning a new value to the property and then calling the Alter method to apply the change:

$server.Databases["TwitterSearch"].RecoveryModel = "Simple"

$server.Databases["TwitterSearch"].Alter()

$server.Databases["TwitterSearch"] | Select Name, RecoveryModel

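Since the same $server variable exposes every database on the instance, a loop can apply the change to all of them at once.  This is only a sketch, assuming the $server variable created above, and it skips system databases via the IsSystemObject property:

```powershell
# Set every user database on the instance to the SIMPLE recovery model
foreach ($db in ($server.Databases | Where-Object { -not $_.IsSystemObject })) {
    $db.RecoveryModel = "Simple"
    $db.Alter()   # apply the change on the server
}
```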

This hopefully provided some useful insight into using PowerShell variables to query SQL information and apply object settings.  All posts in this series have worked directly with PowerShell from SSMS, so the next post will outline using the Windows PowerShell window to import the SQLPS module.

A SQL Cmdlet a Day 2 Piping and Select

sqlsafety 2 Comments

This post expands on the basics of SQL Server PowerShell and introduces “piping” as well as using Select in PowerShell cmdlets; the first post can be found here.

Technet defines “piping” as “passing that object – unchanged – from one part of the command to another”.  The definition speaks for itself and accurately describes what occurs when “piping” in a cmdlet.  To demonstrate this, open a SQL PowerShell window from the Databases node in Object Explorer:


Running the dir command returns a list of the databases and some of their properties:


Now pipe the directory results to a Select cmdlet and you will see that there are a number of properties that aren’t returned with just a directory command:

dir | Select *


Now let’s refine the command to be a bit more selective and only return the database name and recovery model:

dir | Select name, RecoveryModel

Now we can pipe the results of the Select to a Sort cmdlet to sort the results based on the recovery model:

dir | Select name, RecoveryModel | Sort RecoveryModel


So now let’s filter the results to show only the databases set to the Simple recovery model and then pipe those results to a sort based on the name in descending order.

dir | Select Name, RecoveryModel | Where-Object {$_.RecoveryModel -eq "Simple"} | Sort Name -Descending


It is true that there are other methods to pass objects or results, such as using variables, but “piping” provides an easy means of accomplishing this and requires less code.
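The pipeline can also end in any cmdlet you like.  As one sketch, piping the filtered results to Export-Csv produces a file ready to hand off (the output path below is just an example):

```powershell
# End the pipeline in Export-Csv to write the results to a report file
dir | Select Name, RecoveryModel |
    Where-Object { $_.RecoveryModel -eq "Simple" } |
    Export-Csv -Path C:\temp\SimpleDatabases.csv -NoTypeInformation
```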

A SQL Cmdlet a Day 1

sqlsafety 2 Comments

Have you ever had the desire to work with SQL using a command line interface?  If you answered “Why would I ever want to do that?” I can give you several EXCELLENT reasons:

1. It’s Kewl!!

2. It will impress your friends!!

3. Chicks LOVE IT!!

ALL RIGHT!!  So maybe I’m exaggerating.  Regardless of my listed reasons, PowerShell is one more tool available to leverage management and administration of SQL.  This post kicks off a series on working with SQL using PowerShell. 

PowerShell is Microsoft’s primary command line interface for the majority of its software products and, as it is built on .NET, there is a lower learning curve for existing .NET developers.  SQL and PowerShell integration began with SQL 2008, and it provides another means of managing, administering, and automating SQL Server.

Starting with SQL Server 2008 you may have noticed the additional “Start PowerShell” menu option when right-clicking within the Object Explorer:


Selecting this option opens up a PowerShell command line interface within the context of where you clicked in the Object Explorer.

The first thing that might jump out at you is that the command window background is black rather than the Windows PowerShell blue, and that the prompt shows the directory context of where the Object Explorer was right-clicked, in this case Databases.  Obviously this is a PowerShell prompt, but it differs from the Windows PowerShell prompt in that it has already loaded the assemblies required to work with SQL Server and places you in the context of where you initiated the prompt.  This allows you to use simple and familiar commands like “dir” (directory), which will return all the objects in that context.  The below screen shot shows that executing the “dir” command displays all the databases and associated properties from within the current instance:

 


Pretty kewl, right!!  Well, what if you want to use PowerShell to view all of the jobs within SQL Server Agent?  Easy enough!  Again this can be done using command shell 101.  First we need to change the directory to the SQL Server Agent jobs location.  For this specific instance I will use the “cd” (change directory) command and specify the path for SQL Server Agent jobs:

cd \SQL\INSTANCENAMEHERE\Default\JobServer\Jobs

From the context of the jobs I now only have to run another “dir” (directory) command:

 


This shows how you can use PowerShell to display objects from within SQL, but what is the real value of this?  I mean, it is pretty cool to be able to do this from a command line interface, but how exactly can this simple command be used to simplify administration?  Consider that your management team has requested a list of all jobs running on an instance of SQL.  This could be done several different ways:

1. Open jobs in SQL agent and get a screen shot

2. Execute a T-SQL statement to return this from the msdb database and paste the results into a file:


USE msdb;
GO

SELECT name
FROM sysjobs;
GO

3. Use PowerShell to directly pipe the results to a text file

dir > d:\SQL\jobs.txt
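The redirection operator is shorthand; the same result can be sketched with Out-File, which also lets you control the output encoding (the path is just the example from above):

```powershell
# Equivalent to: dir > d:\SQL\jobs.txt, with an explicit encoding
dir | Out-File -FilePath d:\SQL\jobs.txt -Encoding ascii
```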

This demonstrates a VERY basic introduction to using PowerShell for SQL Server.  My next post will examine piping cmdlets as well as using Select in PowerShell.

SSIS Do I Union All or Merge??

sqlsafety 0 Comments

The Union All and Merge SSIS data flow transformations provide confusingly similar functionality by combining result sets.  So what are the differences, and when should you use one or the other?  Despite the almost identical results there are some clearly defined rules as to when to use the Union All over the Merge and vice versa.  The Union All should be used over the Merge when:

1. The transformation inputs are not sorted
2. The combined output does not need to be sorted
3. The transformation has more than two inputs

Probably the most notable limitation of the Merge transformation that the Union All overcomes is that a Merge can accept ONLY two inputs, while the Union All supports multiple inputs.  Another drawback to the Merge is that both inputs must be sorted.  This can be done by using a Sort transformation before the Merge, or by using an ORDER BY clause in the source query and then setting the IsSorted property and the SortKeyPosition of the columns in the desired sort order.  Technet outlines setting the sort order here.

Both transformations require that the columns in the inputs have matching metadata; for example, they must have compatible data types.  This requirement is the same as for any T-SQL set operator.

To demonstrate these two transformations, create an SSIS package with an Execute SQL task that will populate two tables with identical schemas to be used by both transformations:
USE tempdb;
GO

IF EXISTS(SELECT * FROM sys.tables WHERE name = 'employees')
BEGIN
DROP TABLE employees
END

IF EXISTS(SELECT * FROM sys.tables WHERE name = 'customers')
BEGIN
DROP TABLE customers
END

SELECT TOP 10 BusinessEntityID,
       FirstName,
       MiddleName,
       LastName
INTO employees
FROM AdventureWorks2012.Person.Person
ORDER BY BusinessEntityID

SELECT BusinessEntityID,
       FirstName,
       MiddleName,
       LastName
INTO customers
FROM AdventureWorks2012.Person.Person
WHERE BusinessEntityID IN(
3, 5, 8, 277, 45, 14847, 3456, 76, 9874, 15937)

Now add a data flow task that will be used for the Union All transformation, with two OLE DB data sources, one for the customers table and one for the employees table.  Then add a Union All transformation and connect both sources:

Add a flat file destination and connect the Union All transformation to it.

Add another data flow task and again add two sources, one for customers and one for employees.  Each source should use a query with an ORDER BY clause on the BusinessEntityID:

SELECT BusinessEntityID,
       FirstName,
       MiddleName,
       LastName
FROM employees/customers
ORDER BY BusinessEntityID

Add a Merge transformation and connect both sources to it.  An error now appears saying that the IsSorted property must be configured for the Merge transformation.  Right-click on both the customer and employee sources, go to the Advanced Editor, select the Input and Output Properties tab, and select the OLE DB Source Output.  In the Common Properties pane set the IsSorted property to True:

Once the IsSorted property has been set you must still specify the column(s) that are sorted.  Open the OLE DB Source Output and the Output Columns, select the BusinessEntityID, and set the SortKeyPosition to 1:

Again add a flat file destination and connect the Merge transformation to it.

Running the package and reviewing the output files shows near-identical results, with the exception that the NamesMerged file has the same records ordered by the BusinessEntityID.

The sample package outlined can be downloaded here.
