SQLPam's Blog

March 15, 2012

SQL Saturday #110

Filed under: SQL Saturday — sqlpam @ 8:20 pm

So we have another SQL Saturday Tampa on the books.  This was our fifth – our fourth at Kforce, which supplied the facilities.  Based on the reviews, it was another success.

This year we had two last-minute precons, held on March 9th, 2012 at the Ybor City Hampton Inn.  One session was for the DBAs, led by Eddie Wuerch (@eddiew), entitled The DBA Skills Upgrade Toolkit.  The other session was for our BI audience, led by Bill Pearson (@Bill_Pearson), entitled Getting Started with PowerPivot and Other Microsoft Business Intelligence Topics.  The events were opened to the public only a couple of weeks before the event – so an attendance of 34 was a pleasant surprise.

We had over 40 people attend our pre-event dinner at Spaghetti Warehouse.  A good time was had by all.  It was a chance for our speakers, sponsors, and volunteers to get together and let their hair down before getting to work on Saturday.  These were the core of the people who helped make the event a great success.

The morning of March 10th, 2012 found us at Kforce very early.  This year registration opened at 7:00am to allow for the elimination of one room that was not conducive to learning.  Instead of six time slots we had seven – so that meant a very early start.  Registration was a breeze for those that pre-printed the SpeedPasses supplied by the SQL Saturday site this year.  This did create some confusion.  Last year, since I was prototyping the process, each registrant got a SpeedPass delivered via email.  The common thread from those that did not print them was that they never received the PDFs.  I need to do a better job of clarifying the process next year.  We also had issues with the SQL Saturday site’s SpeedPass process on the day of the event, which created a few glitches for those that did not pre-print their SpeedPasses.  This is a work in progress – so all information has been submitted to PASS.  The site continues to improve – so I see these wrinkles being resolved soon.

We had 42 scheduled sessions led by 40 speakers and 1 co-speaker.  We did end up with 2 no shows the day of the event.  One of these was filled by a speaker on site with a similar topic while the second was left unfilled.

We had 20 sponsors for the event, 11 of them on site.  Three of the sponsors were bloggers trying to get their names out there – my hope is they got the exposure they were hoping for.  We also had three Bronze, five Silver, seven Gold and one Platinum sponsors.  Although this looks like a lot of sponsorship money, a lot of it was in trade for materials needed for the event.  My hope is that each sponsor got what they hoped to get out of the event, to encourage them to continue sponsoring SQL Saturdays in the future – not only Tampa, but all events.

We saw a marked decrease in registrations this year.  There were several likely reasons.  First, we changed the time frame of our event.  In past years it has been held soon after the beginning of the year.  This posed challenges with sponsorship budgets – so this year we moved to March.  I received some emails stating that people had planned on coming in January – but March was not possible.  The other factor is the number of SQL Saturdays that are occurring.  I think that the sponsors, speakers and attendees are reaching a saturation point.

A big round of applause should go out to our volunteers.  They were the silent partners that made the event a great success.  I would like to give a special word of thanks to two of my volunteers.  Karla Landrum (@KarlaKay22) is now a PASS employee – but she went well above the call of duty to assist me with details that boggle the mind.  Brooke Ranne (@BabblingBrook01) managed the registration desk for me.  She has gotten it down to a science by volunteering at a number of the events in the Southeast.  On Friday night, the volunteer staff stuffed 300 event bags in less than an hour.  I have to say – my team kicked butt.  My warmest gratitude to each and every one.

Lots of things went right – but besides the registration glitches, we had a few things that raised concerns from attendees.  I heard concerns about signage – we had more this year than ever before – I guess there will have to be more next year.  There were also concerns that although the Guidebook was mentioned in a pre-event email, there was not enough information for attendees to know how to use it.  The cardinal sin of all was that we ran out of coffee.  Last year it was overkill so I backed off – looks like I need to gear it up a little more for next year.

Thank you to everyone who spoke, sponsored, volunteered or attended.  You each added to the community.  For the first-timers – I hope to see you again in the future.  For those that came back after a previous attendance – thank you for continuing the cause: free training and the expansion of the SQL Family.

 


February 10, 2012

In SSRS, Long Tool Tips Don’t Auto-Magically Wrap

Filed under: SSRS — sqlpam @ 5:58 pm

So here is the issue: The user wants detailed information as to how data is derived.  There is no way to shortcut it – you have to get wordy.  So you store the verbiage in a VARCHAR(MAX) column and populate the tool tips using this. 

Works great!  There is the verbiage – and right off the screen it goes.  What?  No – it isn’t smart enough to wrap to stay on the screen.

So now what? 

Here’s what I did. 

I found the longest tool tip in the table and then broke it out into 100 character segments.  I created new fields for each needed segment and updated them in the table. 

But it was more than that – I broke it out on word breaks so that words were not chopped.  My solution was not elegant, but here it is:

My original tool tip field was GrpToolTip.  I added GrpToolTip1, GrpToolTip2, GrpToolTip3, GrpToolTip4, and GrpToolTip5 – all VARCHAR(100).

I look for the original tool tip to be longer than 100 characters.  I determine the location of the first space after position 80 – to allow for longer non-breaking text – and pull the leftmost characters up to that point.

If the original tool tip was 100 characters or less, I populate the appropriate new tool tip with the original.

Next, I update the original tool tip to remove the text moved to the new tool tip.

Then I repeat the process thru all the new fields.

UPDATE rptReportTemplate
SET GrpToolTip1 = LEFT(GrpToolTip, CHARINDEX(' ', GrpToolTip, 80))
WHERE LEN(GrpToolTip) > 100

UPDATE rptReportTemplate
SET GrpToolTip1 = GrpToolTip
WHERE LEN(GrpToolTip) <= 100

UPDATE rptReportTemplate
SET GrpToolTip = REPLACE(GrpToolTip, GrpToolTip1, '')

UPDATE rptReportTemplate
SET GrpToolTip2 = LEFT(GrpToolTip, CHARINDEX(' ', GrpToolTip, 80))
WHERE LEN(GrpToolTip) > 100

UPDATE rptReportTemplate
SET GrpToolTip2 = GrpToolTip
WHERE LEN(GrpToolTip) <= 100

UPDATE rptReportTemplate
SET GrpToolTip = REPLACE(GrpToolTip, GrpToolTip2, '')

UPDATE rptReportTemplate
SET GrpToolTip3 = LEFT(GrpToolTip, CHARINDEX(' ', GrpToolTip, 80))
WHERE LEN(GrpToolTip) > 100

UPDATE rptReportTemplate
SET GrpToolTip3 = GrpToolTip
WHERE LEN(GrpToolTip) <= 100

UPDATE rptReportTemplate
SET GrpToolTip = REPLACE(GrpToolTip, GrpToolTip3, '')

You get the picture…

Now that I have all the data segmented into the proper lengths, I update the stored proc that feeds the report to include the new segmented fields instead of the original field.  I then need to refresh the report’s dataset so it sees the new fields.

The next step is what all this was about.  Where the tool tips were displayed, instead of just setting the tool tip to the original tool tip field, I utilize an expression that forces each segment onto its own line.  Here is the expression:

=IIF(LEN(Fields!GrpToolTip1.Value)>0,Fields!GrpToolTip1.Value,"") +
          IIF(LEN(Fields!GrpToolTip2.Value)>0,CHR(10) + LTRIM(Fields!GrpToolTip2.Value),"") +
          IIF(LEN(Fields!GrpToolTip3.Value)>0,CHR(10) + LTRIM(Fields!GrpToolTip3.Value),"") +
          IIF(LEN(Fields!GrpToolTip4.Value)>0,CHR(10) + LTRIM(Fields!GrpToolTip4.Value),"") +
          IIF(LEN(Fields!GrpToolTip5.Value)>0,CHR(10) + LTRIM(Fields!GrpToolTip5.Value),"")

Note that I only include the CHR(10) when the segment is populated, and I also strip off leading spaces for an even left edge.  Both could have been handled in the SQL above with a little more effort – but it works here.

Not rocket science – but hopefully it will save a few steps for someone in the future.

February 1, 2012

and so it begins… SQL Saturday Tampa 110

Filed under: SQL Saturday — sqlpam @ 10:22 pm

I am starting the push to get things together for our next SQL Saturday in Tampa.  This will be my fifth.  Every year seems to be a little bigger and, hopefully, better.  The call for speakers just closed.  This year it has been a bit daunting: we received 110 session submissions from 43 speakers – based on those numbers, many speakers submitted multiple sessions.

I want to try something a little different this year – based on some of the recent events put on by PASS, I am asking the community to assist with selecting the sessions.  The first pass thru the mix will be to limit one session per speaker.  To this end, I have done the geeky thing of importing the sessions and writing a report that would give me the information to best make the decisions – in SSRS of course – and exported it to Excel.  It is now located at http://tinyurl.com/6uwy7s4.  If you are game, please download it and fill in your preferences for each speaker.  It will look empty, so you will need to navigate to the top.  When you have done what you want, please email it to me at SQLSaturdayPam@live.com.  Your assistance would be greatly appreciated.

I’ll keep you informed as to our progress…

Amendment: I received a response via Twitter that had a comma-delimited string of session IDs – this works well if you would prefer not to mess with the spreadsheet.  Thanks again!

October 24, 2011

Give Camp Tampa – 2011

Filed under: Community — sqlpam @ 4:13 pm

This past weekend I had a new experience.  Although I have frequently been involved in work for charities, I found a new opportunity.  I was able to use my technical skills for a deserving group. 

David Liebman organized the GiveCamp Tampa.  He worked with DeVry University to supply the location and a number of sponsors to provide food and swag for the participants.  He was able to coordinate with 2 groups for us to supply services.  The first effort was to build a web site for the Conrad schools out of Orlando.  They are a school that assists special needs kids.  The second effort was a BI Proof of concept project for the FL Children’s Services Council.  This group advocates on behalf of children across most of the larger counties in the state to provide assistance for kids and their families.

We split the group into teams; each had 4 developers and a charity sponsor.  The developers who knew .NET took on the web page.  They developed a site that the sponsor was extremely pleased with.  When finished, they trained her on how to maintain it.  They seemed to have the easy project – but I heard there was one grumble about not writing much code, since they used DotNetNuke for the site.

I went with the BI team.  It was an ambitious project.  Just getting things set up to start work took all of Friday evening and most of Saturday morning.  We then hit our stride and split up the ETL so that each of us created at least 2-3 SSIS packages to transport the data into the warehouse.  Jose Chinchilla then led the effort to build the cube.  We started on SharePoint and reports on Sunday morning.  As usual, the reports revealed issues with data and structures – but these were worked thru quickly.  The proof of concept was begun – but it needs more work.  The sponsor was very pleased to have such a great start on the project.

I learned a lot from the process.  There are so many ways to approach any task.  Pulling together a team of volunteers lets each see the approach their fellow volunteer takes and expands future options.  I will definitely look forward to next year’s Give Camp.

A big THANKS to all who participated in their local GiveCamps across the US and the rest of the world.  I hope it grows bigger next year to assist many other organizations.

October 2, 2011

Tally Code Camp – I was there but not really

Filed under: SQL Saturday — sqlpam @ 2:48 am

Today I had the pleasure of speaking at Tallahassee Code Camp – but I was unable to attend.  Huh?  I led the session via GoToMeeting.  It is probably my least favorite speaking mechanism – but it is fast becoming a blessing for those in more out-of-the-way locations.  I have to say that today’s session was my favorite online session I have led to date.

We got a late start because there were issues with the sound in the room – but once we got started it was glitch free – at least as far as the technology.  I made a few boo-boos that were easily recovered – but GoToMeeting and the related technology worked flawlessly once we got going.

The big thing for me was that the audience actively interacted.  This is something I have missed in all of my previous online presentations.  I was able to get a feel for how the audience was reacting based on the questions being asked.  I would still prefer to be there in person – but this was a nice second choice.

I want to thank Maureen Jugenheimer for offering the remote opportunity to participate in an event I was not able to physically attend.  She made sure the technology worked and actively moderated the session.  I look forward to meeting her in another week or so at PASS Summit.

April 27, 2011

Using Multi-Value Parameters in SSRS Datasets

Filed under: SSRS — sqlpam @ 1:14 pm

I was given a simple assignment to generate a Usage Summary report for the reports in SSRS.  To achieve the required results took very little time.  However, I decided to allow the user the ability to select which report user(s) would be displayed on the report.  Thanks to a recent presentation by Mike Davis, I was able to implement this with minimal effort.  My one gotcha was that I neglected to place parentheses around the parameter – a minor blonde moment.

As I played with the report a couple of days later, it struck me that instead of having to wade thru all the users of a report, it would be helpful to see all users grouped together.  Yes, a grouping on the report would have achieved this – but the individual users would still be there.  So I had the idea to present an All Users option in the user selection list.  If this option was selected – by itself – it meant the user wants to see the summary across all users per report.

My dataset is embedded in the report – not best practice, but the easiest under the current circumstances.  This meant that to determine if a user was in the selected list, all I had to code was:

UserName IN (@UserList)
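For context, here is a minimal sketch of what such an embedded dataset query might look like – the view and column names are hypothetical, not from the original report:

```sql
-- Hypothetical usage-summary query; SSRS expands the multi-value
-- @UserList parameter into a comma-separated list for the IN clause.
SELECT   UserName,
         ReportPath,
         COUNT(*) AS ExecutionCount
FROM     dbo.ReportUsage            -- hypothetical source view
WHERE    UserName IN (@UserList)
GROUP BY UserName, ReportPath
```

Note the parentheses around the parameter – that is exactly the piece that is easy to forget.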

The issue came in when I tested in my query whether the only selected user was All Users.  If there was only one selection from the list, a simple test of @UserList = '  All Users  ' would work.  If more than one value was selected, I started getting errors.

I got around this issue by creating a new internal parameter.  I set this parameter to 'MultiValue' if more than one selection was made.  If only one selection was made, I set the new parameter to the selected item.  The default expression for the new parameter looks like:

=IIF(Parameters!IncludeUsers.Count>1,"MultiValue",
              Parameters!IncludeUsers.Value(0))

I now had a parameter I could test against to render the results I wanted.  If the new parameter was the All Users option, I set the User Name to All Users instead of the actual user name.  For readability, I did the initial query as a CTE and then grouped on the information in the CTE.  I had not thought to use characteristics of parameters as internal parameters until this need arose.  I am now considering where this might be helpful in other situations.
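A rough sketch of how that CTE might look – all names here are illustrative, not the original query:

```sql
-- Hypothetical sketch: @MultiValueCheck is the internal parameter
-- described above, and '  All Users  ' is the padded label from the
-- selection list.  When All Users is the only selection, every row
-- is collapsed into a single "All Users" bucket before grouping.
;WITH Usage AS
(
    SELECT CASE WHEN @MultiValueCheck = '  All Users  '
                THEN 'All Users'
                ELSE UserName
           END AS UserName,
           ReportPath
    FROM   dbo.ReportUsage                 -- hypothetical source view
    WHERE  UserName IN (@UserList)
       OR  @MultiValueCheck = '  All Users  '
)
SELECT   UserName, ReportPath, COUNT(*) AS ExecutionCount
FROM     Usage
GROUP BY UserName, ReportPath
```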

March 17, 2011

Cascading Parameters

Filed under: SSRS — sqlpam @ 10:22 pm

I have attended Mike Davis’ session on Advanced Parameters for SSRS several times now.  One of the coolest features he brings up during the session is cascading parameters.  In my recent online presentation for 24 Hours of PASS, I was asked how to make this happen.  Since I have not been able to find a write-up by Mike to point to, I will take a swing at it here.

First, cascading parameters are when one parameter’s selection options are determined by another parameter’s selected value.  The example Mike uses is letting the user narrow down city selections by defining the state first.  Once the state is selected, only cities in that state will be presented as options.

In my example, I will use some data I have on hand for a current project for the LPGA.  They often need to select the tournament to review.  Since they have data for 50 years, it would be nice to narrow it down to a single year.  So we will present the user with 2 parameters, the Year and the Tournament.

The first thing I do is set up my dataset to present the Tournament options to the user (dataset screenshot omitted).

Please note that I am filtering my list based on the parameter @Year.  Normally I would add additional filters and ORDER BY options – but we are keeping this simple.
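The dataset query itself might look something like this – the table and column names are illustrative, standing in for the screenshots:

```sql
-- Hypothetical cascading dataset: the Tournament list is filtered
-- by the already-selected @Year parameter.
SELECT   TournamentID,
         TournamentName
FROM     dbo.Tournament
WHERE    TournamentYear = @Year
ORDER BY TournamentName
```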

Next, I set up the parameter for the Tournament.  In this case, I will present the Tournament Name to the user, but actually grab the TournamentID as the parameter value to utilize (parameter setup screenshot omitted).

Please note that @Year must be presented before @TournamentID or there will be issues, because @Year must be available for the dataset to pull the Tournament list correctly.

When the parameters are first presented, the user has no options for the Tournament selection.  Once the year is entered, the options appear (screenshots omitted).

Thanks, Mike, for making it easy.  This works starting with SSRS 2005.

January 26, 2011

SQL Saturdays Gone Wild – My Response

Filed under: SQL Saturday — sqlpam @ 9:30 pm

Recently Karla Landrum posted a blog – SQL Saturdays Gone Wild.  I was eyeball deep in planning SQL Saturday #62 in Tampa – so I threw together a response, which she asked me to blog.  I did not have the time it deserved, so I placed it on the back burner.  Here is my response.

I just completed my 4th SQL Saturday.  The first was tough because SQL Saturday had not established itself.  The second and third got easier and easier.  We had established a track record and were able to point, as ROI, at previous sponsors who were relatively unknown then and are now well known and doing well.  But at the same time the phenomenon had not really taken off.  This year, SQL Saturdays are becoming more common.  In previous years, scheduling was by quarter, then every other month, then monthly.  This year it seems to be weekly, and still I am seeing doubling up.  For the sponsors who once aspired to being national sponsors, that means a lot of budget required and still not a lot for each individual event.

So far I have spoken more to the sponsorship side of things – but a similar issue is being raised on the session/speaker side.  The first year, we did not have that many speakers.  We offered 5 tracks, if I remember correctly, and we had numerous speakers doing more than one session.  We moved up to 7 sessions per time slot last year and still had a couple of speakers doing multiple sessions.  This year was the first year I had to turn speakers away.  It was painful.  My first response was “next year we need a bigger venue – more tracks” – but is that really what we need?  As it is, we have a nightmare ahead just getting the scheduling in place.  Minimizing conflicts between speaker schedules and conflicts between common topics or popular sessions is already tough enough.  Is expanding the number of sessions going to help?  It makes a single choice for each time slot more difficult for the attendee.  Bigger is not always better.  Keep in mind, the number of attendees per event is not growing proportionately.

So what are the planners supposed to do?  Scaling back expectations is the first thing that comes to mind.  Since lunch is the one thing that scales with attendance – let the attendee pick up the cost?  If so, we need to make sure that if the cost of lunch is the only thing holding an attendee back, we can cover the few impacted.  I know I will be buying fewer giveaways.  I still want to take care of my speakers – they supply the content our attendees come to see – but it has gotten more expensive.  I now have more speakers than sessions because of co-speakers.

For our sponsors, it is more difficult to provide solid ROI.  With the opt out option, ROI is not what it used to be.  I try to give the sponsor a little extra by introducing the sponsors in emails prior to the event as well as in the event guide.  You don’t want to cram the sponsor down the throat of the attendee – but they are paying to let the attendee attend for free.  The attendee gets a few emails that have short intros to the sponsors.  After that, the sponsor must woo the attendee to get them to hand over a raffle ticket which we provide.  We also have a BINGO card which helps encourage the attendee to meet the sponsors.  At the same time, the sponsor must find a means of interacting with the attendees that do approach them at the event.  Some are definitely more successful than others.  Relationships are an important part of this process.  Some sponsors are better at cultivating these relationships than others.

To offset the decrease in sponsor funding, we decided to hold 2 pre-cons, one focused on the DBA and the other on BI.  The issue I found with pre-cons is they are slow to build momentum.  You have to put money out to get them going.  The upfront costs are speaker travel expenses and booking the required space.  Even with early-bird specials, the registrations were slow to show.  We did not get to the break-even point until early in the week of the event.  We did make a very small profit, so there were some monies to offset SQL Saturday expenses.  We may need to focus on local speakers next time.

I am still struggling with answers to all this.  I know that I am going to find a way to make it happen.  The education opportunities are too good to pass up.  The networking is more critical these days than ever before.  So going without is not an option.  Going for bigger is probably not the answer either.  I believe that an equilibrium will be reached in the next year.  I think that it will require an adjustment of expectations for both the organizers and attendees.

I welcome your ideas on the subject.  Like Karla, I saw complaints from attendees that there were too many sessions they wanted to see in a single time slot – so expanding the number of rooms is problematic.  I also saw complaints that there was not enough SWAG and no t-shirts.

The Speed Pass process at SQL Saturday #62

Filed under: SQL Saturday — sqlpam @ 8:08 pm

Recently, I put on SQL Saturday #62 with the assistance of Jose Chinchilla and many other volunteers. Since the biggest black eye from last year was the registration process, I have spent the past year trying to think of ways to make the process easier. I came up with a process that Jose dubbed “Speed Pass”. Since the big slow down with registration is finding the registrant’s personalized raffle tickets and name badge, I decided it would be easier if the registrant was able to print their own materials and bring them to the event.
Since the process is not currently part of the SQL Saturday site, more work was required than should be needed once the process is available on the site. Here is what I had to do to make it happen outside of the site.
The first step was to bring the registrants into a local database where I could access them. This meant printing a report available to the SQL Saturday admins and downloading it into Excel format. I then had to strip the title off the report so the first line had the column headings. I then stripped the spaces out of the headings to make cleaner field names. I used the Tasks/Import process to bring the data into an already created table which defined the fields more accurately. I then used a script to append any new registrations into a “permanent” registration table set up with primary keys and additional info I was extrapolating from existing data or updating as new data was collected.
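The append step described above can be sketched roughly like this – the staging and permanent table names, and the use of e-mail address as the matching key, are hypothetical:

```sql
-- Hypothetical: append only registrations not already present in the
-- permanent table, matching on e-mail address.
INSERT INTO dbo.Registrant (FirstName, LastName, Email)
SELECT s.FirstName, s.LastName, s.Email
FROM   dbo.RegistrantStaging AS s   -- table loaded via Tasks/Import
WHERE  NOT EXISTS (SELECT 1
                   FROM dbo.Registrant AS r
                   WHERE r.Email = s.Email)
```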
The second step was to create the SSRS reports to print the personalized info for each registrant individually. Each report would accept the registrant’s primary key as a parameter and generate the report. The tough part was getting the information to line up properly when output to PDF. Once this was achieved I set up a File Share subscription for each report and modified it a little in the ReportServer database to allow for On Demand subscriptions. I went this route so that future users were not required to utilize Enterprise edition.
Basically, an On Demand subscription can be triggered via a script anytime a user needs it. This is based on a blog published in 2007 by Jason Selberg. A stored proc will manipulate an existing Subscription record so that the time to trigger it is now. Once the report is finished, the stored proc restores the Subscription record back to its original form.
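One common way to queue an existing subscription immediately is the AddEvent procedure in the ReportServer catalog.  A rough sketch follows – the Subscriptions table and AddEvent proc are real catalog objects, but the Description used to locate the subscription (and the parameter-swapping step) are illustrative only:

```sql
-- Hypothetical sketch: fire an existing "TimedSubscription" right now.
DECLARE @SubID NVARCHAR(260);

SELECT @SubID = CAST(SubscriptionID AS NVARCHAR(260))
FROM   ReportServer.dbo.Subscriptions
WHERE  [Description] = 'SpeedPass badge report';  -- hypothetical name

-- Any edits to the stored subscription (e.g. swapping the registrant's
-- key into the Parameters column) would happen before this call:
EXEC ReportServer.dbo.AddEvent
     @EventType = 'TimedSubscription',
     @EventData = @SubID;
```

Restoring the subscription record afterward, as described above, would be a second UPDATE once the report has been generated.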
The third step is to generate the file shares. This requires a script that loops through the registrants and generates 2 file shares, one for each report. These files were named based on the registrant’s primary key. They were stored to an isolated folder on my machine. This process took almost an hour per 100 registrants. Most of that processing time was spent generating the PDFs.
The next step was to actually email the PDFs that were generated. This was handled via a script that created a personalized email with the 2 PDFs attached. The emails were sent via dbmail. My biggest hurdle was that my ISP blocks port 25 and my mail servers only used port 25. I ended up using my air card, which was on a different ISP. One lesson I learned was to include the primary key in the subject line to make it easier to track who was using Speed Pass.
The final piece of this was to track the registrants that opted to use Speed Pass. They let me know via an RSVP email. These registrants were marked off my list to print for the day of the event. A second script generated PDFs for those not using Speed Pass and these were taken to the printer the night before the event.
I had 450 registrations the night before the event. 150 people had responded that they were printing their own reports. On the day of the event, I received 175 speed passes printed by the registrants. I was able to tell the source because the emailed speed passes were a different size than those printed for the day of the event. We had around 320 people attending; 305 were actually registered and went thru registration, and 15 registered the day of the event.
The biggest thing for me was that there was not a bottleneck in the registration line this year – a big improvement over the previous year. The event evaluations applauded Speed Pass as allowing attendees more time with the sponsors. The only downside I saw to the process is that people overprinted the materials to stuff the raffle boxes. I had some names duplicated 5-10 times in each raffle box. That will be my issue to resolve for next year.
I will blog the On Demand subscriptions in detail in the near future. Any event organizer willing to try the process is welcome to contact me. I will be very happy to share the code.

January 18, 2011

SQL Saturday #62

Filed under: SQL Saturday — sqlpam @ 6:57 pm

This weekend was the culmination of a lot of effort by a lot of people. 

We started with a Day of Data – kind of a pre-con for SQL Saturday.  This was held on Friday at the historic Italian Club in Ybor City near downtown Tampa.  The building is close to 100 years old and is beautiful.  We sponsored 2 simultaneous sessions.  The first on the agenda was Storage and Virtualization for the DBA, led by Denny Cherry.  The second was Business Intelligence End to End, led by Stacia Misner.  Both sessions were well attended and received rave reviews.

We then had our pre-event dinner.  We met at the Spaghetti Warehouse – again in historic Ybor City – to host our speakers, sponsors and volunteers.  The main point of the event is to give everyone an opportunity to mingle and build tighter bonds in the community.  There were at least 75 people attending, including family and companions.  From what I could tell, a wonderful time was had by all.  We unveiled our speaker shirts and passed out a new addition – lapel pins – to all attending.

The next morning started the main event.  This was held at the national headquarters of Kforce and the building next door – La Tam.  This is the 4th Tampa event and the 3rd we have held at Kforce.  Every year they go a little further to make it as easy as possible for us.

I was a little nervous this year about the registration process.  Last year, it was a big failure.  We had to send people to classes without registering because of a major back up in the process.  This year, I was determined to not have this happen again.  I came up with a process that Jose Chinchilla named “Speed Pass”.  Basically, the big hold up has been the distribution of personalized printed materials to our registrants.  So I allowed them to print their own this year.  Part of the printing process included a “speed pass” ticket and name badge.  We had 461 registered on the site.  Over 150 people responded that they had printed their speed pass tickets.  Based on previous experience, I estimated that 50% of my attendees were participating in the speed pass process.  We normally have 25-30% non-attendance – which brought our estimated attendance to 300-340.  150 meant close to 50%.  It played out well – there was very little back up in the line on Saturday morning.  I will blog later on the technology behind the Speed Pass process.

We had almost 3 full boxes of food donated to the food drive.  I want to thank all who participated in the food drive.  Feeding America will be sending an evaluation of how many families will be fed from our donations.

The day was full of informative sessions, including a Women in Technology round table.  Please keep your eyes open for the various blogs by our speakers.  We had so many great people speaking and attending.  I was fortunate that in this season of illness, I only had 2 speakers who were unable to attend – their sessions were easily covered by other speakers – thank you, Andy Warren and Jorge Segarra.  I was astounded to find I had one speaker who actually faced blizzards to arrive.  A special thanks to Ira Whiteside of Melissa Data for braving the elements during his drive down from Wisconsin.

We ended the day with the usual swag-fest.  This was a time to acknowledge our sponsors, speakers and volunteers.  We also heard a little from all the user groups represented.

I want to send a special thanks to Jose Chinchilla for all the assistance he lent this year in helping me prepare for the event.  Without him, I would have had to scale back a whole lot.  If I try to thank everyone by name, I am sure I will leave someone out – so I will leave it at thank you to all who spoke, sponsored or volunteered.  You are why these events work.  Thank you to our attendees – your presence makes the effort worthwhile.

What a great community we have!!  I have seen nothing like it anywhere.  If you are not participating – I encourage you to get out there and join the best technical community around – SQL folks – you ROCK!!!

