SQLPam's Blog

January 27, 2013

SQL Saturday Tampa #192

Filed under: Community,SQL Saturday — sqlpam @ 1:17 pm

It is that time of year again.  This is the sixth year I have geared up for a new SQL Saturday in Tampa.  I am very excited this year, and I hope you are planning on attending – register today!

For the last 4 years we have been generously sponsored by K-Force at their national headquarters in Ybor City.  They bent over backwards in an attempt to make us feel at home.  As a result, we have grown to the point that we have outgrown them.  My deepest gratitude to K-Force for their continued support of the technical community. 

This year we are moving on.  We will be hosted by Hillsborough Community College at the Ybor Campus.  They are graciously supplying us with a number of rooms which will allow us to bring our attendees more session selection than ever before.  The rooms will be split between two buildings.  We are just a few blocks away from K-Force – so no need to learn a new area.  I am hoping our attendees will enjoy our new location.

The new location has a number of benefits for us.  First, we have more rooms, which means more sessions, resulting in more options for our attendees.  Next, we are able to open our sponsor pool to local recruiters – something that was not allowed at K-Force for obvious reasons.  My hope is this will take some of the pressure off our national sponsors, who are starting to feel spread pretty thin.

We will also be hosting 3 pre-cons this year!

For the DBAs: SQL Server Internals from the practical angle by Dmitri Korotkevitch

The session covers how SQL Server stores data and works with indexes; how to design efficient indexing strategies; how different database object types are implemented internally, along with their pros and cons; why we have locking and blocking in the system and how to deal with concurrency issues; and, finally, a few methods that can help with performance troubleshooting of the system.

For the BI enthusiasts: Taking the Next Step with Reporting Services by Jessica Moss

In this full day session, Jessica teaches you how to enhance your basic reports to make them more appealing, flexible, and user-friendly.  You will learn how to create advanced reports using expressions and actions, and finally, you will learn how to use best practices to administer your reporting solution with ease.  Jessica’s extensive industry experience will be apparent as she shares best practices and case studies while improving the AdventureWorks sample reports as her examples.

For Professional Development: Creating Your Best Technical Presentation: A Speaker Workshop by Buck Woody

This is a workshop-style session. That means that each attendee will be required to prepare and deliver a mini-session that will be refined throughout the day. Laptops and Internet connectivity are required.  This is very hands-on – so registrations for this session are very limited.

Register today to secure your seat!

December 14, 2012

Suppress zeros & retain fixed decimals when exporting to Excel

Filed under: SSRS — sqlpam @ 8:42 am

I have recently been working on an interesting project where we dynamically build data for export to Excel using SSRS.  The spec was a bit crazy sounding.  We have a bunch of data that we need to export on a nightly basis – but the definition changes on a regular basis and, by the way, we have an unknown number of sheets with unknown formats that need to be consolidated into a single report for export to Excel.  The only definite I was able to establish was that no individual sheet has more than 100 columns.  That did help.

We were able to actually pull this all together with relative ease until it came to formatting.  I will expand on the character versus numeric data and the dreaded green triangles at another point.  For today I would like to touch on decimal formatting for numeric data.

Here is the dilemma we encountered.  We have both a numeric and character column for each of the 100 columns we are working with.  However, the numeric data could be int, bigint or decimal.  Luckily, the only decimal format was limited to 4 decimal places – or the solution would have been more complicated.

In the value expression for the numeric column, we were already evaluating the passed data type and converting the value accordingly using CINT() or CDEC().  I was able to open the code for the RDL and globally change the value expression to multiply the CDEC() value times 1.000.  It was a suggestion I found in a related post.  I am not sure if it was really needed – it did not hurt – so I kept it.
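The multiply-by-1.000 trick relies on decimal arithmetic preserving scale. As an analogy only (Python's decimal module, not SSRS or T-SQL), multiplying by 1.000 widens the result to at least three decimal places, which is likely why that suggestion helps decimals survive the round trip:

```python
from decimal import Decimal

# In decimal arithmetic, the scale (exponent) of a product is the sum of
# the operands' scales, so multiplying by 1.000 forces trailing decimals.
print(Decimal("5") * Decimal("1.000"))    # 5.000
print(Decimal("2.5") * Decimal("1.000"))  # 2.5000
```

The same scale-widening behavior applies to SQL Server's decimal type, which is the hedge behind keeping the multiplication even when it "did not hurt."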

The next issue was the format string.  I was able to easily apply a format string to each numeric column using "#.####" with the hope that this would both zero suppress and force 4 decimal places.  My hopes were dashed.  I did achieve the zero suppression – but decimal places were not to be found.  However, this did give me a convenient means of globally changing all when I found the solution.

The solution was a conditional format.  I was lucky: the converted INT data retained 0 decimal places, so I was able to use the format string:

=IIF(Me.Value=0, "#", "F4")

Using Me.Value meant I was not required to touch each column manually – a global replace in the RDL handled this for all 100 columns.  The F4 translated well to Excel.
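The behavior of that conditional format can be sketched outside SSRS. This Python sketch (the function name is my own, not part of the report) mimics what "#" and "F4" do in .NET formatting – zeros suppressed to an empty cell, everything else fixed at four decimal places:

```python
def format_cell(value):
    """Mimic the SSRS conditional format: suppress zeros, else fix 4 decimals.

    In .NET formatting, "#" renders 0 as an empty string, while "F4"
    renders a number with exactly four decimal places.
    """
    if value == 0:
        return ""          # zero suppressed, like the "#" format
    return f"{value:.4f}"  # fixed four decimals, like "F4"

print(format_cell(0))    # ""
print(format_cell(3))    # "3.0000"
print(format_cell(2.5))  # "2.5000"
```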

May 9, 2012

Sad Day

Filed under: Personal thoughts — sqlpam @ 7:24 pm

We knew this day was coming since near the beginning of the year. We said goodbye to Ziggy, our beautiful, deaf cat. We have had him for almost 12 years. He and Ryan grew up together. He was Ryan’s constant companion and sometime play mate. They were often caught playing tag with Ryan having the scars to prove it.

Bobby brought home a spindly white kitten to be the Goober to our black cat – Boog. Boog was highly offended that we brought this upstart into her home. I don’t know that she ever got over it. After seeing his eyes though – he became Ziggy after David Bowie’s character Ziggy Stardust. He had odd eyes – one blue the other golden/green.

Ziggy seemed extra calm – especially considering the loud noises that were going on with a four year old in the house. We had him almost a month when we discovered he was deaf. Boog took off as soon as the lawn mower was started. He only ran when the gust from the blade startled him.

His deafness made it hard to train him. It also seemed to make him grumpy. He would swat anything that irritated him – which was most things. Ryan saw it as a game – so life was pretty good. He stayed indoors, slept in our bed and got big – twenty pounds big.

Over the holidays we noticed what we assumed to be a cyst. By the time our schedule allowed for a vet visit – it had grown. By the time surgery could be arranged – his vet said it was too big. We found another vet. Unfortunately, the surgery revealed that it was cancer and our budget did not include room for radiation treatments at 5K a pop.

The growth came back with a vengeance. He was starting to have trouble walking and showing signs of bowel incontinence. Ryan made the call to see the vet today. But as chance would have it – he seemed to be doing much better today. We found that the tumor had ruptured and was oozing out to relieve pressure. After agonizing over the decision, we said goodbye. The vet did not push and showed the patience of a saint.

We will miss him – grumpy Gus that he was. I don’t think he knew he was a cat. He would wait by the door for us and meow loudly to greet us home. I think that is what I will miss most – he was always glad to see us get home -even if it was that he hoped we would feed him something extra. Good bye old friend – you will be missed.

May 2, 2012

Speakers with too many choices

Filed under: Community,SQL Saturday — sqlpam @ 11:39 am

I just got thru reading Eddie Wuerch’s ( B | T ) blog on Cross-Submitting to SQL Saturday Events. I consider myself both a seasoned organizer and speaker when it comes to SQL Saturdays. I do understand both sides of the event very well – which is why I am thrilled every time I am selected to speak. There are choices the organizer has to make and some of them are very difficult.

When selecting sessions, I usually go for a 2 step process. The first step is to narrow down the speakers. This has only been truly painful one year where I had 55 speakers submit and only 42 slots. Sending out the “sorry you were not selected but may be an alternate” emails really hurt. Once I select the speakers, I send an email that they have been selected. If they only submitted one session they know what they are speaking on immediately – otherwise there is a delay. I ask them to commit at this point and only assume the commitment after I receive the response. I had one exception to that – but that won’t be applied next year. In my mind, until there is a 2 way commitment – there is no contract. These speakers are doing this out of the kindness of their hearts at their expense. If I do not get a commitment – they do not go on the schedule and someone else may be added or another speaker gets multiple sessions.

I now know who my speakers are going to be and can organize them into a schedule. My speakers also know – so they can get travel scheduled early enough (hopefully) to get a break on the cost of travel. They may not know what they are speaking on yet – but they know they need to arrange to be here. I now have a little breathing space with the schedule.

This is where the time consuming piece is for me. It is a good thing to have choices but sometimes the number of choices can be overwhelming. As part of the “You have been selected” email, I ask my speakers if they have time constraints on when during the day they can speak. I have a number that need to either leave early in the day or will arrive late in the morning. They are paying for their travel – so whatever it takes works for me. I found this avoids a lot of last minute changes to schedules. I also have a little time to get the local group involved in making some of the choices.

Once the final decisions are made the speaker gets a notification of the session selected. If they need to cancel – I hope and pray they cancel before everything goes to printing. I usually at this point just ask someone already on the schedule to add a session – I usually don’t get complaints about that. Cancelling the day of the event is where I get frustrated. There are legitimate reasons – death in the family, car accident, or travel delays are several that have actually happened. I have had no shows with no notification or inadequate notification – these are issues that place the speaker on a special list for future events. This is not to say they never have a chance at any of my future events – just that a backup will be in place ahead of time if they are selected.

So for me – if a speaker lets me know early in the process – it is not an issue. It would be nice to know which speakers are cross-submitting so that I can be prepared for a back out. But having them back out before the final session selection does not hurt my feelings or my pride. This is about community. If they are backing out of my event for another SQL Saturday – the community is still winning. As long as I know before the schedule is printed – it is no skin off my nose.

Having so many SQL Saturdays that this is becoming an issue for some is not a bad thing. Sometimes it feels like we just need more Saturdays in the year.

 

April 18, 2012

SQL Saturday #111 recap

Filed under: Community,SQL Saturday — sqlpam @ 2:50 pm

I was very honored to be accepted to speak at SQL Saturday 111 in Atlanta. There were so many submissions and such well-known names that I was grateful to be included in such company.

I took Friday off – since this was supposed to be a shoestring trip. I left early and drove up, stopping to visit a longtime family friend in Buford – about 45 minutes from the event. I had a nice visit and changed into SQL Saturday gear for the speaker dinner. My GPS got me there with no mishaps.

I got to the dinner and there were so many familiar faces it was hard to know where to start. I won’t even try to ID them all here. I got my hugs from Karla (Twitter ), Stacia ( Twitter ), Jessica ( Twitter ), and Aaron ( Twitter ). The room read like a who’s who list for the SQL community. I sat down to a delicious dinner sharing the company of Stuart ( Twitter ) and Laura – a volunteer for the event. We discussed the in’s and out’s of organizing a SQL Saturday and family while enjoying a sumptuous dinner. I had prime rib – but it was a tough decision.

After eating, I touched base with a number of my SQL Family – so many I usually only see at Summit. It had been a very long day. Although the party moved on to parts untold here – I opted to go straight to the hotel. I still had to pull together last minute stuff for my presentation.

The next morning, I arrived a little late, checked in and decided to check out the sponsors. Some were old friends, and a few became new friends before it was over. As a SQL Saturday organizer – sponsors are very important to cultivate. They are the backbone of the event. So I make sure they know I appreciate the efforts they are making. Atlanta had a wonderful setup for the sponsors – they had 2 rooms with snacks to draw the traffic and room to move around. Have I yet said I was envious of their venue? For the record, I would love to have one like it here in Tampa. I got tied up meeting with sponsors and new faces that went with well-known names, so I ended up missing the second session. But there was such great SQL family time going on that I was OK with that.

I had lunch with some friends and got everything I needed together so I was ready for the next presentation and then mine. I sat in on Andy Leonard’s ( Twitter ) session on SSIS Framework. He blows me away with his ability to make it seem simple while all the time showing humility. He is one in my SQL family that I really look up to – there are few that I really admire. Thanks Andy for setting a wonderful example.

Then it was time for my session. It had been discussed earlier that this was the dreaded time slot. Everyone was settling into a stupor with lunch hitting bottom and the previous All Star time slot finishing up. I had prepared. I had even presented the session twice before. Both times I felt good after the presentation was over. But this time nothing flowed easily. The timing seemed awkward. The material was presented – I got a number of thanks afterward. Several who had seemed bored in the session actually let me know the parts that were aha moments for them. It might have been that I was so tired – whatever it was – I was glad it was over when it was over. I am going to have to spend some real time with this session to make it more comfortable.

Moving on – I was glad to attend Jessica Moss’ presentation on Report Parts. I learned a lot. I have not done any Report Builder. But it is something I need to get familiar with so it is an option I can offer my clients. I am impressed with Jessica. We both took a class on presenting way back when – prior to SQL Saturday #1. I have watched her grow and far exceed any level I aspire to achieve. She engaged the room and knew her material inside out. It was a pleasure to attend her session.

That was the last session so we all moved outside. The sponsors pulled raffle tickets. Many of us were very thrilled to see so many volunteer shirts running up to claim prizes. They deserve more than most to take something tangible home from the event.

We moved to the after event. I was pleasantly surprised at the number of people that showed at the after party. In thinking back – the die-hard partiers were those that had to head back to the hotels. I think Aaron was the only local to hang out to the next location – what a trooper!

In all it was a wonderful day. I learned a lot. I met many new people – some I have followed or seen on Twitter – while others were brand new to me. I got to connect with cherished friends and SQL Family I usually don’t see outside of Summit.

A special thanks to Audrey Hammonds ( Twitter ) for heading up the organization of SQL Saturday 111 as well as the entire team of people who worked to make this happen. Thank you one and all.

 

April 11, 2012

Monster Reports – Part III – Disparate Controls

Filed under: Presentations,SSRS — sqlpam @ 8:29 pm

This – as the name states – is Part III in a series on My New Presentation – Taking the Scary out of Monster Reports. Previously, we discussed that there are 2 types of monster reports: Disparate Data and Disparate Controls. This article will discuss Disparate Control reports, some of their challenges and some ways to get around them.

With this type of report, I usually start by defining the components that will go on the report. If the user has supplied a detailed version of what they want I know how to proceed. If not, I need to ask myself and sometimes the user, some of the following questions:

  • Is the report portrait or landscape?
  • Will I need more than one page for the report?
  • How many columns are needed across a page?
  • Can the different components be sized close to the same width? There might be different column widths on a page, allowing you to group the components into the columns by width.
  • Do the different components have a relatively fixed height or will they grow with data? Top X reports are relatively fixed – as are set aggregates. A list based on parameters may grow beyond your page.

You may need to work with the user to come up with a clean design. Knowing the column widths up front is critical for a clean report before you start the next step of building the individual components.

I do recommend building the different components individually. This does three things:

  1. It allows me to work in a more controlled environment without worrying about messing up something else.
  2. It adds to the catalog of reports I can offer my users.
  3. It builds a “library” of reports I can pull onto my Monster reports.

Item one is really big on my list. It really pains me when I have a report close to done and then mess it up by adding something that doesn’t work. By breaking them out individually, I get the luxury of a clean sandbox. I know that what I am attempting is possible. Remember that some things are easier to accomplish than others. This is where we find that out.

Item two tends to be high on my clients list. In addition to the Monster report, my user often wants to be able to print the individual components. I will often place these in a subfolder and adjust the parameter settings to enhance the user experience. If I am implementing the reports on the Monster report as sub reports, these are not the deployments I access. Those are usually deployed so they are not visible to the average user. This allows the user to share a small portion of the Monster report without exposing everything. It is a nice bonus for them.

Item three is my primary reason for breaking these out into individual reports. Basically, I am creating the Lego blocks I will be using to build my report. If I need a red block – I can easily grab it and implement. It has my data sets and layouts to make my life so much easier. Most importantly – I now know they actually work.

Now that I know what components I need, I need to determine how I am going to fit them together. I have used two methods primarily. Sub reports called from a table on the Monster report or including everything on the base layout.

A table holding sub reports is usually the easiest method. There are two things to consider. If the heights of the controls are not consistent, there will be gaps when one control takes up more vertical space than the other controls on a row. Remember that SSRS does not allow you the luxury of spanning rows. But let’s assume that all your components are the same size. We need to handle the parameters. When calling sub reports, the parameters are expected to come from the table’s defined dataset. In our case, we need to create a dataset that defines our parameters as the columns of the main grid’s dataset. Now we are set for a fast build. I usually define one sub report and copy it to the other cells. I then make changes based on each component’s parameter needs. This makes pulling it together very fast and very easy. It just needed a little prep.

The other method is a little more solid – less exposed blank space. I start by adding each individual dataset on the report. If I am reusing a component with different parameters – I need multiple datasets. At this point I move to the layout where I can start dropping copies of the original components onto the report.

If I drop the components directly onto the base layout, I will encounter the same blank space issue as I had with the table. However, if I place rectangles on the base report and drop a copy of the component into the rectangle I have fewer issues with blank space. The rectangles allow each “column” to grow vertically independent of the contents of the other rectangles. The rectangles can also help set the page breaks if needed.

Don’t get me wrong, Monster reports are seldom simple. But with the previously outlined steps, I hope you find them less “Scary”.

You can find the files associated with this presentation at: http://sdrv.ms/II1Lcn.

April 1, 2012

Monster Reports – Part II – Disparate Data

Filed under: Presentations,SSRS — sqlpam @ 2:23 pm

This – as the name states – is Part II in a series on My New Presentation – Taking the Scary out of Monster Reports.  We discussed that there are 2 types of monster reports: Disparate Data and Disparate Controls.  This article will discuss Disparate Data reports, some of their challenges and some ways to get around them.

When I get a request for one of these reports, I envision someone sitting there with stacks of reports, a pair of scissors and lots of tape.  More likely – it was tons of Excel spreadsheets and utilization of the cut and paste options – but the effect is the same – they took the totals from a lot of reports and smashed them together into a single report.  I can understand why they want them – but it is the expectation that these smash-up reports will run as fast as one of the original reports and not tax the system in the least that usually gets to me.

So let’s state the obvious – in a perfect world – the data for these reports would all come from cubes and we would be able to easily pull this data using MDX.  That world does exist for the rare few – but not for most of us – so we get over it and roll up our sleeves.

The trouble here is that now we have to run the queries that created each of those spreadsheets we saw earlier.  If we are lucky – we have the resulting spreadsheet to work from.  It holds the clue to how our report will work.  If not, it might save some time to build that spreadsheet – not really the data – but the layout.  It will tell us what groupings we need, the row layout for each grouping, as well as the formatting of each line.

So for disparate data reports – it is all about the data – more than anything else.  So we need – for the most part – a single stored procedure to return all the data for the main grid on the report.  I usually have to play with how this will happen at each client’s site – because everyone’s data is different.  If the data is not massive – CTEs may be the best solution available; others might need table variables or even temp tables.  The old standby – “it depends” – is very much in play here.  But the need is the same – accumulate the data into a single query-able source.

It never fails – I get everything into the ONE SOURCE when the client comes back to me and says – I have a few minor changes; “I need to change the order on some of this and add the following…”.  Let’s just plan on this happening up front – because it will.  Let’s add a “control table” to the mix.  This is a table that describes the different data areas that will be displayed on the report and the attributes of each.  This is my blueprint for the report and it is usually a work in progress throughout the life of the report.

For my demo, to promote self-containment – my control table is actually a CTE.  I find a table infinitely easier to work with – but the demo has other needs.  The control table will serve a multitude of purposes.  It will control the order of the data being displayed, the format it is displayed into and as documentation for anyone coming behind me on the report.

In the monster stored procedure, for each queried bit of data, I include a short description of the data being collected.  This is in verbiage – because it is easier to relate than numbers – but again – I keep it short.  This is going to be my link between the stored procedure data and the control table.  I will have a more verbose description that gets displayed to the end user – that can be easily changed on a whim.  I will have a number that represents the order that the data is displayed.  From here the sky is the limit – anything I need to track can be included here.  My format is usually a must, but I often include source data information and tool tips.  Note that multiple fields may be required for tool tips if they are verbose because tool tips do not wrap.

The final SELECT statement in the monster stored procedure joins the collected data to the control table and imposes order while adding the elements needed for the report.

In the report itself, the source data set will be based on our monster stored procedure.  We will add the groupings with headers and footers as needed.  We may need multiple detail/header/footer rows to accommodate the multiple layouts involved.  The control table will provide the key as to which will be visible for each row displayed.  A critical note – if this is being exported to Excel prior to 2008 R2 – the multiple layout rows need to be kept to a minimum as each row is visible in the Excel export.  The control table can also provide the extras such as tool tip verbiage.
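The join-to-control-table pattern can be sketched in Python rather than T-SQL. All names here (control_table, final_select, the keys and formats) are hypothetical illustrations of the idea, not the actual schema from the project:

```python
# Hypothetical sketch of the control-table pattern. Each collected row
# carries a short key; the control table supplies the display order,
# a verbose user-facing label, and a format for that key.

control_table = {
    # key:           (sort order, display label,      format string)
    "tot_sales":     (1, "Total Sales",       "{:,.2f}"),
    "avg_discount":  (2, "Average Discount",  "{:.4f}"),
    "order_count":   (3, "Number of Orders",  "{:,.0f}"),
}

collected = [  # rows produced by the "monster" queries, in arbitrary order
    {"key": "order_count",  "value": 1234},
    {"key": "tot_sales",    "value": 98765.4},
    {"key": "avg_discount", "value": 0.0375},
]

def final_select(rows, control):
    """Join collected data to the control table and impose display order."""
    joined = [
        (control[r["key"]][0],                     # sort order
         control[r["key"]][1],                     # verbose label
         control[r["key"]][2].format(r["value"]))  # formatted value
        for r in rows if r["key"] in control       # the join condition
    ]
    return [(label, text) for _, label, text in sorted(joined)]

for label, text in final_select(collected, control_table):
    print(f"{label}: {text}")
```

Changing the display order or a label then means touching only the control table, never the queries – which is the whole point of the pattern.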

So for me – the control table is a must have when dealing with disparate data reports.

The next in the series will be dealing with Disparate Controls.

March 30, 2012

Proud Momma

Filed under: Personal thoughts — sqlpam @ 1:14 am

Today was a rough day and very strange – full of lots of twists. I did not get more than a couple of hours sleep last night while staying at my mother’s house. No reason I could put my finger on – sleep was just elusive. So the morning was not welcome. I was at Mom’s to drive her to a dental appointment where she was to have some oral surgery. I live 2 hours away and somehow – I am the only one able to reliably drive her to these things. Her friends her age scare her – so she looks elsewhere – which means me. I am grateful to have my Mom. I lost Dad back in ’94 – so I smile and thank the Lord I have her in my life.

We ate, got ready and drove over – almost an hour’s drive. I hoped I might sleep in the foyer while she had her procedure – but no – everyone in the room had their phones out and were doing something – all with touch noise confirmations going. So there were clicks from one of the phones, dings from another and some strange hiss from the third. In the meantime my stomach was telling me it was not happy with the recent drug overdoses of antibiotics for the ear/nose/throat infection I was trying to get over.

They all got quiet and I fell asleep only to be wakened by the dentist himself. He introduced himself and I expected an update on my mother’s surgery. She was fine but this was a different visit – he wanted to tell me how much my mother reminded him of his mother who he lost in 2001. He brought photos and shared how he felt closer to Mom than most of his patients due the resemblance. I let him know that I knew I was lucky to have her and thanked him for taking special care of Mom. I later found he told the staff I was his “other” sister. Strange encounter – but not unwelcome.

When Mom was at checkout – I found my stomach was not going to stay quiet and ended up sick in the ladies room. I did feel better afterward in some ways – but very empty. Things were not settled enough to feel comfortable eating enough to fill the void.

After several stops on the way home – I crashed. I really needed to get home to my family. Today was my son’s Ryan’s 16th birthday. A big milestone for a youth and I was missing it. If I attempted to drive home at this point – I would never make it – not how I wanted him thinking of his Sweet 16. So I laid down and took a nap after calling and letting everyone know where I was. That call let me know that my loving husband now had the ear/nose/throat infection I was trying to kick and he was not happy about it.

I slept and was woken up by Mom’s new cat – Boots. I got things packed and loaded into the car and headed home. I got home around 8:45 and loaded my son in the car for his birthday dinner – better late than never. He chose Macaroni Grill – the same place for the last 3-4 years. Over dinner I got a question that made me warm and proud. “Mom can you teach me SQL so I can better track the data for my Yugioh cards?” My kid has been paying attention. He saw me prepping for my recent presentation “Taking the scary out of Monster Reports” and liked what I was able to do with the data. Now he wants to do the same types of things.

All the crappy parts of the day disappeared. It was getting played forward to me. He watched me take care of my Mom – always telling him that I have to take care of her while I have her. He was now saying he wanted more time – close time with me. So I gave him a rundown of SQL over pasta. We’ll see how serious he is – but it still feels wonderful knowing he has an interest.

 

March 22, 2012

A New Presentation – Monster Reports – Part I

Filed under: Presentations,SSRS — sqlpam @ 10:35 pm

Taking the Scary out of Monster Reports – this is the title for my latest presentation. For the past 4-5 years I have had one presentation on reporting services – Tips and Tricks of Reporting Services that has been presented at least 15 times. It discusses the joys of the dynamic features in SSRS that allow you to modify the report’s appearance and behavior by using data. But it was getting old – way old and it was time for another main line presentation. I must note – this was not the only presentation over that time frame – just the most well received.

I write reports for my clients. Recently, I saw a trend to more complicated reports or what I refer to as “Monster Reports”. I tend to cringe when I see these. They are very important reports to management. They usually mean that instead of digging to the bottom line of a number of reports – they have one report that encapsulates what they need to see.

I have noticed two directions for these reports. One has a lot of disparate data on the report while the other has a lot of disparate controls on the report.

The disparate data report is usually a report that combines the subtotal/total lines from a number of reports. A lot of the time they look at fixed period(s) in time to display associated totals. But sometimes they get even more complicated by spinning in different columns within the multiple subtotals, or they request charts to visualize one or two of the result sets. These can be a beast to create and maintain.

The disparate controls report has a similar challenge in that you are pulling back very disparate data – but the presentation is what differs so much with this one. You are attempting to wedge multiple charts, grids and potentially gauges on a single page – possibly more. They want it packed tight with few gaps. But they want it nice looking.

So the point of the presentation is to supply the attendees with a number of strategies to help make these “Monster” reports less “Scary” to tackle. So I will be posting a series of blogs over the next few weeks to go into a little more depth than the typical hour presentation will allow. My carrot to the attendees is that at the end of the presentation – they will have access to a series of RDLs that will allow them to start to monitor their Reporting Server Database. The idea for the topic was inspired by Jessica Moss (Blog | Twitter) who did a presentation at PASS Summit 2011 entitled: Preventing the Oh, Poop! Reporting Situation. I liked that at the end of the presentation the attendee actually had something useful. So I built on what she supplied.

The next blog will be on the Disparate Data reports and the techniques I employ to tame them.

A renewed experience

Filed under: Community — sqlpam @ 6:31 am

Last night I did something I have not done in a long while – I visited a SQL User Group other than one of my local groups. You tend to forget how different we are until you step outside your box. Last night I drove the hour and a half from my place over to MagicPASS in Orlando. This is a SQL group run by Kendal Van Dyke (Blog | Twitter). I was a little surprised when I saw in the announcement that the meeting started at 5:00 pm with a video presentation. Driving over – I thought about it. Having a time when people can feel comfortable just dropping by – without feeling they were dissing a speaker – might get that bit of the group that I lose because they have an hour to kill between work and group. Arriving – I found that it was also a great opportunity to network / reconnect with those members as well.

Kendal amazes me with his depth of knowledge and more – his ability to always seem to remember names with faces. This is a major failing of mine. In his group, I saw he helped this out by having everyone wear a name tag – another note to consider for future meetings. As time came for the main presentation – Kendal started with the announcements. This was the time when everyone should be there – right? The announcements were lengthy – but full of very useful, relevant information. He started with the supplied PASS slide deck and worked from there. Additional information included new information on certification – along with the announcement of forming study groups. All great info. Somewhere in all of this the food arrived – nice and hot – delivered by his wife and children. By this time in my group – the speaker would be halfway through the presentation if not really warmed up. We took time to eat and network – that critical piece – network. When everyone had their fill – I was asked if I was ready. Oh no – the panic set in. But these were friendly people – it wouldn’t be an issue.

I got a wonderful intro by Kendal and explained that this was the debut of this presentation. I got a little more than halfway through my material when Kendal informed me that it was 8:45 pm. I almost freaked – how did the presentation get so long? I know there were lots of questions – which is good – but really – almost 9:00 pm? I had been in this room for almost 4 hours at this point – but it was a full 4 hours.  Lots of information exchanged and not all of it from me.  I wound down. We finished with raffles and distribution of shirts and lapel pins left over from my SQL Saturday. We rearranged the room and headed out to the local watering hole for even more networking.

I am not sure what of this I will incorporate back home – but I know my comfortable view of the home meeting has changed. It was great to meet some wonderful new people. I even joined a certification study group to assist me with one of my goals for the year. They will be held via live meeting – so the distance is not an issue it would be otherwise. I came away much richer than I arrived.

Thank you Kendal!

