As a database administrator, I have encountered many occasions on which a business user has asked me to provide the number of rows in the tables of a database. If you haven’t been asked yet, I’m sure the time will come. When it does, I have a script you can add to your toolbox that will allow you to fulfill the request!
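As one possible sketch of such a script, the query below pulls per-table row counts from the catalog metadata rather than scanning each table; the counts in sys.partitions are maintained by the engine and are close to exact.

```sql
-- Approximate row counts for every user table, from catalog metadata
-- (no table scans required).
SELECT
    s.name      AS schema_name,
    t.name      AS table_name,
    SUM(p.rows) AS row_count
FROM sys.tables AS t
JOIN sys.schemas AS s
    ON s.schema_id = t.schema_id
JOIN sys.partitions AS p
    ON p.object_id = t.object_id
   AND p.index_id IN (0, 1)   -- heap (0) or clustered index (1) only
GROUP BY s.name, t.name
ORDER BY row_count DESC;
```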
SQL Server 2012 introduced a feature called project parameters. Of course, like many other developers, I stuck to the old habit of not fixing something that wasn’t broken. Over time, I learned that project parameters are a very beneficial feature, especially when moving a package from DEV to QA and finally to PROD. Let’s take a quick look at how to set up project parameters and how to use them to manipulate connection strings at runtime.
A frequent issue that I’ve encountered while installing a SQL Server failover cluster is “The cluster resource ‘SQL Server (MSSQLSERVER)’ could not be brought online due to an error bringing the dependency resource ‘SQL Network Name (SQL2012CLS)’ online.” Upon checking the cluster events in the Failover Cluster Manager, you will find the error below.
One of the most critical initiatives for any organization involves building a business intelligence infrastructure and solution. Before embarking on this endeavor, it is key to put the proper resources in place for a successful business intelligence implementation and evolution.
Can Microsoft SQL Server and Informix DB2 environments integrate together? The answer is YES!! I have received an increasing number of questions about cross-platform ETL development between the two. Driven by these questions, I want to dig deeper into manipulating data between Microsoft SQL Server and Informix DB2.
I support a system that uses third-party software. After a recent application upgrade, I began receiving sporadic 8623 errors: Severity 16, State 1. These began at just once every few days and quickly escalated to 3-4 per day.
A couple of weeks ago, my colleague, Brandi Dollar, wrote a blog post about SQL Server transaction log basics. Her post is a great lead-in to a script that I wrote to solve a common problem: high VLF counts. An important piece of managing your database transaction logs is keeping the number of virtual partitions within the log file, the Virtual Log Files (VLFs for short), low. A high VLF count is typically the result of running with the default auto-grow settings. As the transaction log file continues to grow in sub-optimal increments, the fragmentation becomes worse and worse, and a high VLF count can contribute to a number of performance problems.
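As a quick way to see where you stand, this sketch counts VLFs per database (assuming sys.dm_db_log_info, which is available in SQL Server 2016 SP2 and later; on earlier versions, DBCC LOGINFO returns one row per VLF instead):

```sql
-- Count VLFs for every database on the instance
-- (requires SQL Server 2016 SP2 or later).
SELECT
    d.name   AS database_name,
    COUNT(*) AS vlf_count
FROM sys.databases AS d
CROSS APPLY sys.dm_db_log_info(d.database_id) AS li
GROUP BY d.name
ORDER BY vlf_count DESC;
```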
If you’re in a pasture and hear hoofbeats…it’s probably not a zebra.
DBAs tend to get pulled in lots of different directions in a company…performance tuning, database design, systems administration, networking, report writing; the list goes on and on. With all of the diverse tasks that a DBA must tend to every day, sometimes we forget the very core responsibility of a database administrator: protecting data. One of the fundamental aspects of protecting data is planning for disasters.
For my first blog post, I decided to write about a cool little project that came across my desk a few months ago. The request was to create an SSRS report that could be used to insert records into a database by supplying the user with dropdown parameter values from a list of tables. With a simple stored procedure, you can easily set up a report to insert records into a table, but there was an added requirement for the user to be able to select the site (database) that the values should come from. As the user needed to be able to select the site first, the rest of the parameters needed to be set up depending on which value they selected. You can set up the data sources dynamically using a couple of different techniques, but I’ll explain how I approached it.