Monday, May 20, 2019

Confessions Article 2, first published in GPUG magazine H1 2019


Confessions of a GP SQL guy
Brent Olsen CPA, CMA, CITP
Senior ERP Developer, ZAGG Inc.
Part Two, SQL scripting basics.


There are lots of great GP solutions on the web out there; I love this community!  Once you find a solution that works for you, embed the URL in your saved SQL script.  That way you not only keep the original reference for the solution, but when you hold CTRL and click the link on line 7 of the code below, it opens the page as a new tab inside SSMS (SQL Server Management Studio).  Now you have everything you need to use this code in the future.  How cool is that?  I used the following script recently from the Microsoft Dynamics GP Community Forum and include it here as an example:
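The script itself was published as a screenshot, so what follows is only a sketch consistent with the line numbers discussed below (the URL comment on line 7, the select on line 11, the update and where on lines 12 and 13, the commit on line 15, the rollback on line 17, and the /* */ block on lines 19 through 24).  The forum URL is a placeholder, and the column and value being updated (SYUSERDFSTR set to an empty string) are assumptions for illustration, not the original forum fix:

```sql
USE DYNAMICS
GO

-- Fix: stuck user default record in sy01402
-- Steps:
--   1. Run the whole script (F5) to preview the affected rows via the select
--   https://<forum-url-here>  (CTRL+click opens the page inside SSMS)
--   2. Highlight from "begin" on line 12 through the end of line 13 and execute
--   3. Re-run the select to confirm, then highlight and run commit (or rollback)

select * from sy01402
--begin tran update sy01402 set SYUSERDFSTR = ''
where syDefaultType = 96 and USERID = 'JosephBlow'

--commit tran

--rollback tran

/*
secondary test scripts go here, for example:
select count(*) from sy01402
select * from sy01402 where USERID = 'JosephBlow'
reminder: run Checklinks after committing the change
*/
```

Because the update on line 12 sits behind the double dashes, running the whole script executes only the select with its where clause; the update runs only when you deliberately highlight past the dashes.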


Most data manipulation scripts used to fix corrupt GP data include INSERT, UPDATE, or DELETE commands, which are at the heart of SQL’s power.  But how do we wield that power for good, not evil?  In the above example, the select command on line 11 is benign; no data will be changed when the entire script is executed.  The update on line 12 has the power to update every row in the table sy01402.  The WHERE clause on line 13 limits which rows will be updated; in this case, only rows with a syDefaultType of 96 and the user ID JosephBlow will be updated, and only if lines 12 and 13 are highlighted and run together.  This syntax lets you write one query for both the select and the update, assuring that the rows affected by the update are exactly the ones returned by the select statement.



Let’s talk about SQL scripting readability and annotation.  I like to use lots of notes so I don’t have to read through code to remember what it does.  The double dashes used throughout the script create a single-line comment: any SQL code to the right of the dashes will not execute if the dashes are included when the code is run.  I like to create reminders or even a checklist of steps to take, since many fixes in GP require you to run Reconcile or Checklinks routines after adjusting data in the tables.

You can also use the characters /* and */, as I did on lines 19 through 24, to encapsulate an entire section of code, which is great for secondary scripts used to ferret out errors or to test a view or stored procedure you created.  Bottom line: none of the code in green will execute unless it is highlighted directly.  If the above code is executed in its entirety by clicking Execute or pressing F5, only the USE DYNAMICS statement and the select on line 11 (filtered by the where clause on line 13) would run.

Also, I like to start a new script by writing the steps I want the code to take as a checklist of comments and then fleshing out the code that does the actual work beneath the notes.  Referred to as pseudocode, this leaves your SQL very readable for you or anyone who may need to execute the script in the future.
Back to the script: the code on line 12 is what does the actual work here.  If you highlight the code from “begin” on line 12 through the end of line 13, the records included in the select statement will be updated, and you should receive a row count matching the count from the original select statement, which was just one row in this example.
This can be confirmed by running the entire script again; the results will show the new value in place.  If the row count is different, execute the rollback tran on line 17 to reverse the update.  Once you have confirmed the accuracy of the results by executing the original select on line 11, be sure to run the commit statement on line 15 to finalize the update.  If you do not roll back or commit, the open transaction will wait indefinitely for your response and cause all other transactions to queue up behind it.  If a user posts after the update but before the commit, GP will essentially stop responding as it waits for the open update transaction to finish.
This code provides additional control over your updates.  If you intend to update just one row but miss highlighting the where statement, the record count will reflect the multiple rows updated.  You would catch this when you run the select statement to confirm the update, and to fix it you just highlight and run the rollback tran statement on line 17 and start over.  A lot nicer than a database restore.  The primary role of transaction handling with commit in production code is to automatically abort changes to the database when errors are encountered.  That error handling is crucial for production systems but beyond the scope of this article; we use the transaction here to give the user a second chance before data is committed with no recourse.
Here is another helpful tip when scripting in SSMS for GP updates: using the Shift and Alt keys to modify columns of data.  SSMS is not a bad text editor.  In the following screenshot, I paste the four SOP numbers sent to me in an email, which I would like to use in a where statement to update user-defined data in the SOP10106 table:
ORD1000001
ORD1000005
ORD1000100
ORD1000105
If I copy that column of data into a query, I can highlight a vertical block to add the single quotes and commas required for the where statement.  If I left-click to the right of the value ORD1000001 as the beginning location, then hold the Shift and Alt keys as I left-click at the end of the value ORD1000105 as the ending location, a thin blue line shows the vertical “highlighted” area:
I then enter the single quote and comma and, presto, all rows are updated.  I do the same for the leading single quote, and I have all the data ready for the statement (after deleting the extra comma on the last SOP number):
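The finished list drops straight into a where clause.  In this sketch, the user-defined column (USRDEF03) and the value being set are hypothetical; substitute the field you actually need to change:

```sql
-- Update user-defined data for the four orders from the email
-- (USRDEF03 and its value are illustrative placeholders)
update SOP10106
set USRDEF03 = 'RUSH'
where SOPNUMBE in ('ORD1000001',
                   'ORD1000005',
                   'ORD1000100',
                   'ORD1000105')
```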
Doesn’t seem like much, but if you need to update thousands of rows with a column of data, it’s a quick fix.
Make sure to try these tips in a test database.  In the next article I will discuss audit considerations in the administration of your GP databases.

Tuesday, October 23, 2018

Confessions article first published 9/24/2018 in GPUG magazine

Confessions of a GP SQL guy
Brent Olsen CPA, CMA, CITP
Senior ERP Developer, ZAGG Inc.
Part One, Visual Database Tools vs. SQL.
In my experience working with Dynamics GP, I have developed several ways to expedite development and address the inevitable data corruption by employing a rarely used feature inside SQL, the Visual Database Tools (VDT).  In this article I will share how I use this tool to manipulate data inside Microsoft’s SQL Server Management Studio (SSMS).  I will also point out some other VDT benefits along the journey and, in the next article, tips to make your SQL code less dangerous if you prefer that method.  I believe that the VDT is useful for both new and experienced SQL users in a variety of situations.  This article assumes a basic working knowledge of SSMS.
Visual Database Tools
VDT is an artifact from an early attempt by IBM to develop a graphical query language to compete with SQL.  Like the Query By Example (QBE) interface MS Access offers, the Visual Database Tools add a graphical interface inside SSMS that is translated to, and executed as, SQL code.  For you noobs out there, this means you can create a graphical request and then see how Microsoft thinks it should be written in SQL, a great way to learn how to code.  For you old timers, it is a quick and easy way to update or fix data inside GP without the tedium and dangers of SQL update and delete statements.
Let me show you a quick example of this.  While posting a PM check batch, the A/P Specialist picks up her laptop and runs to a meeting, aborting the remaining posting process.  After running batch recovery, she determines that while the checks printed properly, invoice 115000 on voucher 00000000000000276 for vendor ADVANCED0001 still shows as an open voucher waiting to be paid.  She finds a solution online which suggests she run the following SQL statement along with Checklinks to clear this up:
UPDATE PM20000
SET CURTRXAM = 0.00000
WHERE VCHRNMBR = '00000000000000276'
This sets the current transaction amount back to 0 which tells the system that the voucher is fully paid and ready to be sent to history.  The Checklinks routine then dutifully moves the paid voucher to history. Life is good.
What happens if you highlight and execute only the first two lines?  Without the WHERE clause, SQL will update all rows setting each one to 0 and Checklinks will send all open transactions to history.  Life is not good (ask me how I know).
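One quick safeguard is to run the same where clause as a select first and confirm the row count before touching the update.  A sketch, using the columns from this example:

```sql
-- Preview exactly which rows the update will touch; expect one row back
SELECT VCHRNMBR, VENDORID, CURTRXAM
FROM PM20000
WHERE VCHRNMBR = '00000000000000276'

-- Only after confirming the preview, highlight and run all three lines together
UPDATE PM20000
SET CURTRXAM = 0.00000
WHERE VCHRNMBR = '00000000000000276'
```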
This is how you perform the same task using the VDT:
In SSMS, expand the company database (TWO in this case), then expand Tables and scroll down to find the PM20000 table.  Right-click the PM20000 table and select “Edit All Rows”; this will open the table with all rows:
Your first impression might be that this is a spreadsheet you can manipulate, and in many ways it is, but at a very simplistic level.  For example, you can double-click a column border to adjust to the widest value, but you can’t sort the column on the fly or apply column attributes the way Excel can.  In our example we are looking to update the CURTRXAM to 0.00000 for the voucher.  Once you scroll down to find the voucher, enter the value in the row and column, then press Enter or step off the row (move up or down) to commit.  The benefit here is that it’s quicker, and you reduce the risk of modifying more data than you intended.
I also use this for testing when developing table triggers.  Changing the values in existing rows will fire update triggers.  If you highlight the row selector on the left, you can copy the entire row of data, scroll down to the bottom, and insert it back in on the first empty row, the one marked with an asterisk (altering the key field or fields to make it unique), to simulate a table row insert.  Stepping off the inserted row, you will receive a SQL error immediately if there is a problem with the trigger.  If you instead test the trigger by entering or updating a transaction inside GP, the actual SQL error might be suppressed by the UI, and GP can hang with no elegant way to recover.  In SSMS, on the other hand, pressing the Esc key resets the value and escapes the error.
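As a concrete illustration, here is a toy audit trigger of the kind this row-editing technique can exercise.  The audit table, trigger name, and the assumption that VCHRNMBR alone identifies the row are all simplifications for the sketch:

```sql
-- Hypothetical audit trail for CURTRXAM changes in PM20000
CREATE TABLE dbo.PM20000_Audit (
    VCHRNMBR  char(21)      NOT NULL,
    OldAmount numeric(19,5) NOT NULL,
    NewAmount numeric(19,5) NOT NULL,
    ChangedAt datetime      NOT NULL DEFAULT (getdate())
)
GO
CREATE TRIGGER dbo.trg_PM20000_Audit
ON dbo.PM20000
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON
    -- Log old and new amounts whenever CURTRXAM actually changes
    INSERT dbo.PM20000_Audit (VCHRNMBR, OldAmount, NewAmount)
    SELECT d.VCHRNMBR, d.CURTRXAM, i.CURTRXAM
    FROM deleted d
    JOIN inserted i ON i.VCHRNMBR = d.VCHRNMBR
    WHERE d.CURTRXAM <> i.CURTRXAM
END
```

Editing a CURTRXAM value in the VDT grid and stepping off the row fires this trigger immediately, so any bug in it surfaces as a SQL error right in SSMS.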

Also like in Excel, the entire contents of the table can be highlighted by clicking the upper-left column/row cell.  The contents can be copied and pasted into Excel or text editors like any other text data.  Copy-paste operations are memory intensive, so I don’t do this with thousands of rows in a table, but it is very handy for copying data into a temporary table that I use to integrate to GP with an integration tool or directly with an update SQL script.  Inventory items seem to need a lot of regular updating.  I keep a simple Item staging table available in a Sandbox database, outside the GP company databases, for the inevitable updates.
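The staging table itself can be as simple as the handful of Item Master fields you actually maintain.  This column subset is only an illustrative guess; match it to the fields you update:

```sql
-- Minimal item staging table in the Sandbox database
-- (columns mirror a few common IV00101 fields; adjust to your needs)
USE Sandbox
GO
CREATE TABLE dbo.ItemStaging (
    ITEMNMBR char(31)      NOT NULL PRIMARY KEY,
    ITEMDESC char(101)     NOT NULL,
    ITMCLSCD char(11)      NOT NULL,
    STNDCOST numeric(19,5) NOT NULL
)
```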
Tables can hold millions of rows of data, so how do you find the one you are looking for?  This is where we get into the actual query aspect of VDT.  There are three panes available from the SSMS toolbar that are helpful during queries:
The Criteria pane lets you enter criteria to build queries and to sort and filter the results.  The SQL pane displays the underlying SQL query that will run based on the criteria, sorts, and filters entered.  While the SQL in this example shows a simple select statement, you can still update the data in the results, which I find to be one of the most compelling reasons to use VDT.  The Results pane, you guessed it, displays the results of your query.
A fourth Diagram pane allows you to create relationships between tables (joins) but is beyond the scope of this article.  If I need to join tables I prefer to write it in SQL.
If we enter the voucher number 00000000000000276 into the filter column and click Execute SQL you will see only the one document in the results below.  The CURTRXAM amount can be updated and will be committed when you step off the row.
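For reference, the SQL pane would show something close to the following for that filter (the exact formatting SSMS generates may differ slightly):

```sql
SELECT *
FROM   PM20000
WHERE  (VCHRNMBR = '00000000000000276')
```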


Be careful with your VDT data changes; editing a table directly is open-heart surgery on your data, but then so is an update statement, so code responsibly!

Bulletproof SQL update and delete statements.
Not sold on VDT?  No problem.  In our next article we will look at how you can create safer update and delete statements in SQL scripts to produce the same results we reviewed in the Visual Database Tools.

Tuesday, November 1, 2016

GP2013 to GP2016 Upgrade, lessons learned

The GP2016 upgrade is over; we laughed, we cried, we became one with the universe.  It had been a while since I had done a full upgrade, so I was itching to go through it and get my hands dirty; careful what you wish for.  This is a brief list of some issues I ran into, along with the solutions employed.  I will try to credit other authors for materials pilfered.

1.     REPORTS.DIC does not work consistently on a VM shared folder!  Make sure your sysadmin creates a Windows share on a readily available server other than your GP SQL server.  We have an "admin" server where I keep our SmartConnect UI and related services, and this is where we host the share for REPORTS.DIC.  It took a week to figure that one out, since I eventually realized the first shared folders were not created in Windows...  Reports printed fine in the test environment, but once we moved to production, GP would sometimes lock up on the user or, in one case, run for 30 minutes and finally return a generic SQL error.  The worst part was that, as heavy SOP users, our order transfer functionality would hang because it was trying to print the transfer report; of course, we thought it was a workstation issue.

2.     The GP upgrade proper ran quite nicely and even picked up an add-on I missed in the first pass.  It seems very stable.  Of course, this was the easy part; the difficulty lies in the companion products and the stored procedures and triggers they add.  We worked up some scripts to identify the sprocs and reload them while applying proper security at the same time.
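A starting point for that kind of inventory script, assuming the standard DYNGRP role GP uses for database permissions, might look like this:

```sql
-- List stored procedures in the current database and generate
-- GRANT statements to restore execute rights for DYNGRP.
-- zDP_* procedures are GP's auto-generated ones, so skip them.
SELECT name,
       'GRANT EXECUTE ON dbo.' + name + ' TO DYNGRP' AS GrantScript
FROM sys.procedures
WHERE name NOT LIKE 'zDP_%'
ORDER BY name
```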

3.    I did everything I could in advance.  I upgraded SmartConnect, and then for the GP2016 upgrade I just had to go into Setup > SmartConnect Setup and switch the SmartConnect Dynamics GP Connector's GP version to GP2016.  It rebuilt all the maps but was very clean, with no complaints.

4.    I had our security admin create me a list of all workstations with GP and then used that as a checklist for the client installation.

Thursday, October 16, 2014

Setting up your Database environment for integrations and general GP management

When I take a new job, the first thing I do after reviewing the database status (backups, recovery model, logs, etc.) is create a Sandbox database.  This is a database I use to stage imports into GP and to store quick table backups as I modify data.  The underutilized QBE interface is, for certain operations, faster and more reliable.

A quick example: Purchasing tells you that a newly entered item gives the error "UOM Does not exist..."  You look at the usual suspects for missing setup on the item, but nothing pops.  Finally, you look at the IV00106 table and notice there is no entry for that SKU.  The first thing I do is back up the table to the Sandbox database before I modify it.
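A quick table backup is one statement; the company database name (TWO) and the dated backup table name here are illustrative:

```sql
-- Snapshot the item UOM schedule table into the Sandbox database
-- before modifying anything
SELECT *
INTO   Sandbox.dbo.IV00106_bak_20141016
FROM   TWO.dbo.IV00106
```

SELECT ... INTO creates the backup table on the fly, so the script fails safely if a table with that name already exists.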