Remote SQL Server Troubleshooting and Monitoring Using SSIS
Don St. Cyr (pronounced "saint-seer") | Don.StCyr@gmail.com | @Machiavelli_too | DBADon on StackOverflow
Notes: Questions? Don't hesitate. How many SQL instances does everyone manage? This is a simple application: simple packages, simple process. Pieces for the demo: the report, the spreadsheet, 3 SQL files. Open: C:\Temp, Mod_Stats PS, Report.HTML, the Excel report (short name).
Sponsors
Resources: Professional Microsoft SQL Server 2014 Integration Services (Brian Knight, Devin Knight, Jessica M. Moss, Mike Davis, Chris Rock), published by Wrox; and StackOverflow.
Notes: Largely self-taught; tips appreciated.
All About Me!
DBA at ScriptPro. Formerly worked as a contractor to the U.S. Army and the DoD on their Force Management database. Also was a technical trainer at a hospital and a systems administrator for a school district.
ScriptPro: pharmacy solutions; robotic dispensing systems (pill-counting robots); pharmacy workflow software; full pharmacy management software.
Notes: DB team of 4. 3k-4k accounts; 1k-1.3k accounts; 1.3k-1.5k SQL instances. Robust servers; CS. But we still need current and historical data ASAP: build, apply, or change an index? Stats off? Disk issues? Latency?
Why SSIS for Troubleshooting?
A project to learn on; portability; security; automation; a graphical interface; simple to work with; great documentation and community; preparation for future ETL/data-mining projects; easily interacts with other tools (e.g., PowerShell and the command line). Why not?
Goals: a portable, revolving set of data downloaded from each site; e-mailable reports; compare site to site; compare current vs. old.
Notes: As the new DBA, this was left up to me. The real reasons: learn the tool and add value.
Project Infancy
One Package to Rule Them All!
Glenn Berry's "SQL Server Diagnostic Information Queries" (https://www.sqlskills.com/blogs/glenn/category/dmv-queries/).
[Diagram: a single package containing the report-name query plus Query 2 through Query 11, run in sequence.]
Notes: These are modified versions of Glenn Berry's diagnostic queries. They are very good queries; understand them and you understand SQL Server. Thanks, Glenn. Roughly 20-30 queries plus 5-10 scripts per day/month. All queries ran in one large package, a linear process, head to toe. The first version had 25-30 separate queries (a representative example of this style of query follows below).
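For readers who have not used Glenn Berry's set, the packages wrap queries of this general shape. The snippet below is a simplified, representative DMV query written for this writeup; it is not one of Glenn's queries and not one of the session's modified queries.

-- Representative example only: a simplified wait-stats query in the spirit of the
-- DMV diagnostic queries the packages wrap (not taken from Glenn Berry's set).
SELECT TOP (25)
       wait_type,
       waiting_tasks_count,
       wait_time_ms,
       signal_wait_time_ms,
       wait_time_ms - signal_wait_time_ms AS resource_wait_time_ms
FROM   sys.dm_os_wait_stats
WHERE  wait_type NOT IN (N'SLEEP_TASK', N'LAZYWRITER_SLEEP',
                         N'REQUEST_FOR_DEADLOCK_SEARCH', N'BROKER_TASK_STOP')
ORDER  BY wait_time_ms DESC;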
2nd and 3rd Iterations
[Diagram: the package split into two parallel parts; Part 1 holds the report-name query and Queries 2-5, Part 2 holds Queries 8-14.]
More eyes on the report meant more queries: it bloated to 40-50 queries plus 5-10 scripts. Added parallel streams to run quicker, but the package was still cumbersome, with deployment issues and slow running. Broke the package into two: important stuff first, slow stuff second. With maintenance starting to trump everything else, I was already thinking about breaking things up further.
The next few versions had 40-50 separate queries.
Modular Stats
The current version has 25-30 separate queries.
Improvements over the old: simple construction; flexible arrangement; individual, self-contained pieces can be exchanged or removed; the base system runs with or without any of the individual packages.
*NOTE: The package name and the text output file need to be named the same (see step 4 of the Master Package slide).
Hurdles: create the individual packages; execute all the packages; execute all the packages securely.
Plusses: decreased complexity; flexibility; targeted, easier deployment; specific queries to specific sites.
Notes: We went from a high of 60 queries down to 25-30; some queries we thought would be great were dropped along the way.
Nuts and Bolts
We create a Data_Feed table with the important information that we need to run the packages: the PACKAGE names, the CLASS and SUBCLASS (to create the run order), the SAVEDAYS (how many days to save the output), the DIRECTORY where the files get staged, and a VALID_TIME and an INVALID_TIME to turn the packages off and on as needed.
Notes: The table lives in the SQL performance/stats database. Package and output names must match. Class and Subclass order the runs, Dewey-decimal style. C:\Temp is a bad choice for the staging directory! Valid/invalid times turn packages off and on. Show the Data_Feed SQL with the update statements; it will be in the downloads (a rough sketch follows below).
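The real Data_Feed DDL and update statements are in the session downloads. As a stand-in until you have them, here is a minimal T-SQL sketch: the column names follow the slide, while the data types, lengths, and the sample package names and dates are assumptions.

-- Hypothetical sketch of the Data_Feed table; column names follow the slide,
-- data types and lengths are assumptions.
CREATE TABLE dbo.Data_Feed
(
    PACKAGE      VARCHAR(128)  NOT NULL,  -- SSIS package name; must match the output .txt name
    CLASS        INT           NOT NULL,  -- primary run order ("Dewey decimal" style)
    SUBCLASS     INT           NOT NULL,  -- secondary run order within a class
    SAVEDAYS     INT           NOT NULL,  -- days of output to keep; 0 = delete all matching files
    DIRECTORY    VARCHAR(260)  NOT NULL,  -- staging folder for the text output (not C:\Temp in production!)
    VALID_TIME   DATETIME      NULL,      -- package runs on or after this time
    INVALID_TIME DATETIME      NULL       -- package stops running after this time
);

-- Illustrative rows (package names and dates are made up; Rollup runs last by design)
-- plus the kind of update statement used to turn a package off:
INSERT INTO dbo.Data_Feed (PACKAGE, CLASS, SUBCLASS, SAVEDAYS, DIRECTORY, VALID_TIME, INVALID_TIME)
VALUES ('Wait_Stats',  100, 10, 30, 'C:\Temp', '2017-01-01', '2099-12-31'),
       ('Index_Usage', 100, 20, 30, 'C:\Temp', '2017-01-01', '2099-12-31'),
       ('Rollup',      900, 10,  0, 'C:\Temp', '2017-01-01', '2099-12-31');

UPDATE dbo.Data_Feed
SET    INVALID_TIME = GETDATE()      -- disable a package without deleting its row
WHERE  PACKAGE = 'Index_Usage';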
Master Package SP
1. Counts the number of packages in the Data_Feed table.
2. Selects the top 1 based on Class, then Subclass, and executes that package.
3. The package exports to text files and saves them in the DIRECTORY folder.
4. Deletes all the text files (matching the package name) older than SAVEDAYS days in the DIRECTORY folder and its subfolders.
5. If SAVEDAYS = 0, the SP deletes all the .txt files that match the package name.
6. Selects the next top 1 and cycles through steps 2-5 until the count is reached.
7. The last package to run is Rollup, which creates the dated folder and moves the files into it.
8. The Master Package SP finishes by deleting any empty folders.
Notes: All packages get placed in the package folder. C:\Temp is a bad staging choice; all text files in C:\Temp will get moved. A SQL Agent job executes the Master Package SP. The Master Package iterates over the data in the table, executes the valid packages in order, and outputs to text files. The packages are fully encrypted; the Master Package passes in the password, and to keep the password safe the Master Package is also encrypted. The same process can be used for a data-feed project or any other project that iterates over SSIS packages. Show the Master Package SQL (a rough sketch follows below).
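The actual Master Package SP is in the downloads as well. The sketch below is a hypothetical reconstruction of the loop described above, assuming xp_cmdshell is enabled and that packages are launched with dtexec; the procedure name, parameter names, and cleanup commands are invented for illustration.

-- Hypothetical sketch of the Master Package SP loop (the real procedure is in the
-- session downloads). Object names, the use of xp_cmdshell, and the exact dtexec
-- switches are assumptions.
CREATE PROCEDURE dbo.Master_Package_SP
    @PackagePassword NVARCHAR(128),              -- shared password the packages are encrypted with
    @PackageFolder   NVARCHAR(260) = 'C:\Pkg\'   -- folder where all .dtsx files are placed
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @Total INT, @i INT = 1;
    DECLARE @Pkg NVARCHAR(128), @Dir NVARCHAR(260), @SaveDays INT, @Cmd NVARCHAR(4000);

    -- Step 1: count the currently valid packages.
    SELECT @Total = COUNT(*)
    FROM   dbo.Data_Feed
    WHERE  GETDATE() BETWEEN VALID_TIME AND INVALID_TIME;

    WHILE @i <= @Total
    BEGIN
        -- Step 2: pick the next package by Class, then Subclass.
        SELECT @Pkg = PACKAGE, @Dir = DIRECTORY, @SaveDays = SAVEDAYS
        FROM  (SELECT PACKAGE, DIRECTORY, SAVEDAYS,
                      ROW_NUMBER() OVER (ORDER BY CLASS, SUBCLASS) AS rn
               FROM   dbo.Data_Feed
               WHERE  GETDATE() BETWEEN VALID_TIME AND INVALID_TIME) AS d
        WHERE d.rn = @i;

        -- Steps 2-3: run the encrypted package, passing the decryption password;
        -- the package itself writes its pipe-delimited text file into @Dir.
        SET @Cmd = 'dtexec /F "' + @PackageFolder + @Pkg + '.dtsx" /De "' + @PackagePassword + '"';
        EXEC master..xp_cmdshell @Cmd, NO_OUTPUT;

        -- Steps 4-5: clean up old output named after the package
        -- (SAVEDAYS = 0 removes every matching file; otherwise remove files older than SAVEDAYS days).
        IF @SaveDays = 0
            SET @Cmd = 'cmd /c del /s /q "' + @Dir + '\' + @Pkg + '*.txt"';
        ELSE
            SET @Cmd = 'forfiles /P "' + @Dir + '" /S /M "' + @Pkg + '*.txt" /D -'
                       + CAST(@SaveDays AS VARCHAR(10)) + ' /C "cmd /c del @path"';
        EXEC master..xp_cmdshell @Cmd, NO_OUTPUT;

        -- Step 6: next package.
        SET @i += 1;
    END;

    -- Steps 7-8: the last package (Rollup) has already moved the files into a dated
    -- folder; a final cleanup pass here would remove any empty folders left behind.
END;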
SSIS Packages
Make the individual SSIS packages however you would like.
I strongly encourage encrypting the packages and using the password in the Master Package SP to execute them; this prevents anyone else from adding a package of their own for automatic execution.
Ensure the SSIS package name is exactly the same as the output file name.
It is important to consider data privacy and other confidential information, in our case HIPAA. We avoid all PHI and PII in our reports; this information is about server health only!
Notes: Currently all packages are fairly simple, since coworkers without SSIS experience work with them. SSIS can transform anything to anything; we use pipe-delimited output. The output carries the exact name of the package so the process can clean up after itself. 60+ sites, 60-plus days, 10-15 MB of output.
Reports
Each day's information is saved as individual text files into a dated folder.
The folders can be pulled back to a system that has Excel installed and full use of PowerShell.
I also have a report in HTML for sites where getting the data back is more difficult, but that report is currently in beta.
Notes: The servers themselves have no Excel, so PowerShell creates the spreadsheet with charts and graphs (example at the end). No PHI, no PII, no patient medical data. The HTML report is also built via PowerShell and can be viewed on site; it is good for DoD and other sites where we cannot remove any data.
The Rollup Package
The Rollup package uses these variables:

Name   | Scope  | Data Type | Value                                               | Expression
Arg    | Rollup | String    | /C robocopy C:\Temp C:\Temp\20170716 *.txt /mov /is | "/C robocopy C:\\Temp " + @[User::Path] + " *.txt /mov /is"
Date   | Rollup | String    | 20170716                                            | (DT_STR,4,1252)DATEPART("yyyy", getdate()) + RIGHT("0" + (DT_STR,4,1252)DATEPART("mm", getdate()), 2) + RIGHT("0" + (DT_STR,4,1252)DATEPART("dd", getdate()), 2)
Folder | Rollup | String    | C:\Temp                                             | @[User::Folder]
Path   | Rollup | String    | C:\Temp\20170716                                    | @[User::Folder] + @[User::Date]

Notes: Read from the slide. Rollup has two tasks: a File System Task that creates the directory and an Execute Process Task that moves the files. Rollup creates the folder in year-month-day format and moves the text files into the newly created folder; it will also move any other text files it finds there.
The Rollup Package (continued)
The Rollup package uses two tasks; the File System Task configuration is shown here:
The Rollup Package (continued)
And the Execute Process Task configuration here:
Demos
Data_Feed table
Master Package SP
Invalid Time
Save Days, part 1
Save Days, part 2
Save Days, part 3
Report generation
My Next Steps (or, How I'll Do It Differently)
In the Master Package, use the CMD line only (for DoD and other government clients); it currently uses a mix of CMD line and PowerShell.
Export the same data to tables rather than text files, and then use a new report to query the desired data from the tables (see the sketch below).
Use the report (or a separate package) to create a single text file for offsite review.
Have a continuously updating feed of information that I can roll into my own monitoring tool.
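To show what the table-based next step would buy, here is a hypothetical example of the kind of report query it would enable. The Wait_Stats_History and Sites tables and their columns are invented for illustration; they are not part of the current solution.

-- Hypothetical central history tables (not part of the current text-file solution).
-- Compares each site's wait times against its own prior collections over the last 60 days,
-- which supports both site-to-site and current-vs-old comparisons from one query.
SELECT  s.Site_Name,
        w.Collection_Date,
        w.Wait_Type,
        w.Wait_Time_Ms,
        w.Wait_Time_Ms
          - LAG(w.Wait_Time_Ms) OVER (PARTITION BY w.Site_Id, w.Wait_Type
                                      ORDER BY w.Collection_Date) AS Wait_Time_Delta_Ms
FROM    dbo.Wait_Stats_History AS w
JOIN    dbo.Sites              AS s ON s.Site_Id = w.Site_Id
WHERE   w.Collection_Date >= DATEADD(DAY, -60, CAST(GETDATE() AS DATE))
ORDER BY s.Site_Name, w.Wait_Type, w.Collection_Date;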
Thank you, sponsors!
Don St. Cyr (pronounced "saint-seer")
Don.StCyr@gmail.com | @Machiavelli_too | DBADon on StackOverflow
Resources: Professional Microsoft SQL Server 2014 Integration Services (Brian Knight, Devin Knight, Jessica M. Moss, Mike Davis, Chris Rock), published by Wrox; and StackOverflow.