Wednesday, June 4, 2014

SQL Server T-SQL Query for Stored Procedures with Parameters

List of SPs and Parameters

I often want to get a list of all the stored procedures and their parameters for documentation purposes. The following queries work well for me. Most of our SPs have a prefix, so the queries include the ability to filter the list using a prefix, postfix, or keyword.

Using SYS Tables

I originally went down the path of using SYS tables (MSDN):

SELECT
    sp.name AS Name,
    p.name AS Parameter,
    t.name AS [Type]
FROM sys.procedures sp
JOIN sys.parameters p
    ON sp.object_id = p.object_id
JOIN sys.types t
    ON p.user_type_id = t.user_type_id -- joining on system_type_id can return duplicate rows for alias/user-defined types
WHERE sp.name LIKE 'prefix%' -- '%postfix' or '%keyword%'
ORDER BY
       sp.name,
       p.name
;

Using INFORMATION_SCHEMA

I didn't really like having to use JOINs, so I looked for a simpler method. That's when I ran across the INFORMATION_SCHEMA views (MSDN). The result is a much cleaner query:

SELECT
       SPECIFIC_NAME,
       ORDINAL_POSITION,
       PARAMETER_MODE,
       PARAMETER_NAME,
       DATA_TYPE
FROM
       INFORMATION_SCHEMA.PARAMETERS
WHERE
       SPECIFIC_NAME LIKE 'prefix%' -- '%postfix' or '%keyword%'
ORDER BY
       SPECIFIC_NAME,
       ORDINAL_POSITION
;

Hopefully this helps anyone who is looking for a quick way to generate documentation on Stored Procedures with Parameters.
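Since the end goal here is documentation, it can help to turn the query results into readable text. The sketch below is a minimal, hypothetical example: the sample rows (procedure and parameter names included) are made up for illustration, and in practice they would come from running the INFORMATION_SCHEMA query above through your database driver of choice.

```python
# Format INFORMATION_SCHEMA.PARAMETERS rows as simple documentation text.
# Each row: (SPECIFIC_NAME, ORDINAL_POSITION, PARAMETER_MODE, PARAMETER_NAME, DATA_TYPE)

from collections import defaultdict

def document_procedures(rows):
    """Group parameter rows by procedure and render one line per parameter."""
    procs = defaultdict(list)
    for name, ordinal, mode, param, dtype in sorted(rows):
        procs[name].append(f"  {ordinal}. {param} ({mode}, {dtype})")
    lines = []
    for name, params in procs.items():
        lines.append(name)
        lines.extend(params)
    return "\n".join(lines)

# Made-up sample rows standing in for real query output:
sample = [
    ("usp_GetOrder", 1, "IN", "@OrderID", "int"),
    ("usp_GetOrder", 2, "IN", "@IncludeDetails", "bit"),
]
print(document_procedures(sample))
```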

Monday, March 25, 2013

Get List of Functions by LANGUAGE with Basic Statistics

Have you ever needed to get a list of all your database functions by language, along with some basic stats on their lines of code? Here's a quick little query that works on PostgreSQL 8.4 and should work on later versions as well.

SELECT
    lanname,
    count(lanname) AS functions,
    max(qty) AS max_lines,
    ceil(avg(qty)) AS avg_lines
FROM
(
    SELECT
        p.proname,
        l.lanname,
        -- number of newline characters: a rough lines-of-code count
        Length(p.prosrc) - Length(REPLACE(p.prosrc, E'\n', '')) AS qty
    FROM pg_catalog.pg_namespace n
    JOIN pg_catalog.pg_proc p ON pronamespace = n.oid
    JOIN pg_catalog.pg_language l ON p.prolang=l.oid
    WHERE
        nspname IN ('public', 'schema1', 'schema2', 'schema3')
) t1
GROUP BY
    lanname
;
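The line count in the inner query is simply the number of newline characters in prosrc. A quick Python sketch of the same length-difference trick, for anyone who wants to sanity-check it outside the database:

```python
def count_lines(src: str) -> int:
    # Mirrors the SQL expression:
    #   Length(prosrc) - Length(REPLACE(prosrc, E'\n', ''))
    # i.e. the number of newline characters in the function body.
    return len(src) - len(src.replace("\n", ""))

body = "BEGIN\n  RETURN 1;\nEND;\n"
print(count_lines(body))  # 3 newlines
```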

Monday, August 20, 2012

Using Percentiles for Statistical Purposes

Sometimes when I am running load tests or general reports, I need percentiles to know what the overall "feel" is instead of just the average. This is important for things like how long it takes for a page to load, the duration of database queries, etc.

Here are a few simple percentiles to consider that I use often:

P10 ≈ Avg - (StdDev * 1.2816)
P16 ≈ Avg - StdDev
P90 ≈ Avg + (StdDev * 1.2816)
P95 ≈ Avg + (StdDev * 1.6449)
P97.5 ≈ Avg + (StdDev * 1.96)

Avg = Average (the median would be even more robust). Note that these multipliers are standard normal quantiles, so the estimates assume your data follows a roughly normal distribution.
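For any other percentile, the multiplier is just the inverse CDF of the standard normal distribution. A small sketch using Python's statistics.NormalDist (3.8+); the page-load numbers in the example are made up for illustration:

```python
# Estimate percentiles from mean and standard deviation, assuming
# the underlying data is normally distributed.

from statistics import NormalDist

def percentile_estimate(mean: float, stddev: float, p: float) -> float:
    """Estimate the p-th percentile (0 < p < 100) under a normality assumption."""
    z = NormalDist().inv_cdf(p / 100)  # z-multiplier for this percentile
    return mean + z * stddev

# e.g. hypothetical page-load times: mean 200 ms, stddev 50 ms
print(round(percentile_estimate(200, 50, 90), 1))  # 200 + 1.2816 * 50 ≈ 264.1
print(round(percentile_estimate(200, 50, 95), 1))  # 200 + 1.6449 * 50 ≈ 282.2
```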

Thursday, May 20, 2010

Sunday, March 28, 2010

Drupal 7 and PostgreSQL

I am in the midst of preparing for my tutorial, Realistic Load Testing, at PGCon 2010. After talking with a number of folks at my favorite IRC channel, I decided to use the Drupal 7 Project as my test case.
For years, Drupal worked best with MySQL. Not that I harbor any ill will toward that database; it's just that my experience has led me to PostgreSQL as the superior engine. That said, MySQL works great for these types of applications. The applications we build at Digitec, however, tend to need a much more robust solution.
With Drupal 7, there has been a more concerted effort to make PostgreSQL a valid alternative to MySQL. The code changes the Drupal development team has made, and the database abstraction model they adopted, have helped.
My hope is that by using Drupal, I will not only give a good "real world" example application, but also give back to the Open Source community through the tests and tools that will be created for the tutorial. As Drupal 7 is still in beta development, I hope these tests can help the developers look into solutions for any problems that occur. I also hope that it will help in showing that PostgreSQL is a great fit for Drupal.

Friday, March 26, 2010

HOWTO setup pgBouncer on Debian Part 1

When looking to provide database connections under a steady (or heavy) load, you will likely need to look into a database connection pooler. There are a number of good options for PostgreSQL such as pgBouncer, pgpool-II, SQL Relay, and a few others.
My personal favorite as of late is pgBouncer. Here's a few reasons why I like it:
  • a lightweight connection pooler--it is designed solely for pooling
  • low resource requirement
  • written in C, so it is fast and has a very small footprint
  • supports online restart/upgrade without dropping client connections
When setting up a pooler, you will want a separate server just for it. Running the pooler on the same machine as the database makes it compete with PostgreSQL for resources and degrades performance, so avoid that if you can.

I use Debian as my choice of Linux distro. Unfortunately, there isn't an official package to install on Lenny. However, thanks to good ol' backports, we can find an up-to-date package. First things first, add backports to your /etc/apt/sources.list by adding the following source:
deb http://www.backports.org/debian lenny-backports main contrib non-free

Next, you will want to run aptitude update (or apt-get update) to pick up the new source.

You can learn more about how to use backports on their instructions page.

Finally, here's how to do an install of pgBouncer:
$ aptitude -t lenny-backports install pgbouncer
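Once installed, pgBouncer needs a pgbouncer.ini before clients can connect. Here's a minimal sketch to start from; the database name, hostname, and paths are assumptions you will need to adjust for your environment:

```ini
[databases]
; route "mydb" through the pooler to the real PostgreSQL server
mydb = host=db1.example.com port=5432 dbname=mydb

[pgbouncer]
listen_addr = *
listen_port = 6432              ; pgBouncer's default port
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
pool_mode = transaction         ; session | transaction | statement
max_client_conn = 500
default_pool_size = 20
```

Point your applications at port 6432 instead of 5432 and pgBouncer handles the rest. More on pool modes and tuning in Part 2.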

Saturday, March 6, 2010

EXPLAIN ANALYZE is Your Friend

EXPLAIN ANALYZE is a useful tool to help see what query plan the planner creates for any query. One tool that I have found useful for some time is depesz's online tool, http://explain.depesz.com/. Simply "paste your explain analyze plan, and see the output."