I met with some customers from one company today. They took time from their busy schedules to travel to Redmond and spend a couple of days addressing some SQL Server and FoxPro issues. It’s always a pleasure to meet with customers and learn about their real-world needs.
Apparently they have a VFP application that their clients love. They switched their back-end data store to SQL Server, and their clients immediately reported a performance decrease.
When they start their Fox app, the first form takes many seconds to show. We discussed various possibilities for the slowdown, and came up with some reasonable things to try to find the bottleneck. Upon startup, the VFP form was retrieving 62,000 records from SQL Server using a query with various parameters. When the data was in VFP native tables, the startup time was unnoticeable. We talked about how changing the back end data store can introduce or move bottlenecks, and some possible design changes. If the number of records was small, the startup was much faster, indicating that the bottleneck was not the creation of the VFP form.
We also discussed using SQL Server Stored Procedures vs remote views vs SQL Pass Through, and the various VFP Remote View and Connection properties, which could help speed up their form.
I mentioned that you can create SQL Server stored procedures dynamically via VFP code, and that SPs can even be made on a temporary basis: create one, use it, and drop it a few seconds later.
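Transact-SQL has direct support for the temporary case: a procedure whose name starts with "#" is a local temporary procedure, created in tempdb and dropped automatically when the connection closes. Here is a minimal sketch using SQL pass through from VFP (the procedure name and table are hypothetical, and nh is assumed to be an already-open connection handle):

```foxpro
* Hedged sketch: create, call, and discard a temporary stored procedure.
* "#quickProc" lives in tempdb and disappears when the connection closes.
SQLEXEC(nh, "create procedure #quickProc @nm char(10) as " + ;
	"select * from cust where name = @nm")
SQLEXEC(nh, "exec #quickProc 'test8'")	&& results land in the sqlresult cursor
SQLEXEC(nh, "drop procedure #quickProc")	&& optional: dropped at disconnect anyway
```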
Another topic was propagating schema updates. If the customer’s client requests the addition of a new field, or the changing of a field type, then the remote view definitions and the SQL Server tables need to be changed.
Over a dozen years ago, I faced similar issues while working as an independent consultant in Honolulu. The tables needed to be changed, which meant the associated code needed to be updated too. To solve this, I stored the table and field names and types in a single schema table. When the application starts up, the current environment is compared with the desired schema table. If a table doesn't exist, it is created. If there are schema mismatches, the table structures can be altered. This process is fairly quick and can be done every time the application starts, which ensures there are no mismatches between the code and the data schema.
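A minimal sketch of that startup check in VFP (the schema table name and layout here are hypothetical, and this version handles only missing tables and missing fields, not type changes):

```foxpro
* Hypothetical schema table "appschema": one row per field, with columns
* tblname C(20), fldname C(20), fldtype C(1), fldlen N(3)
USE appschema IN 0 SHARED
SELECT appschema
SCAN
	cTable = ALLTRIM(tblname)
	cDef   = ALLTRIM(fldname) + " " + fldtype + "(" + TRANSFORM(fldlen) + ")"
	IF !FILE(cTable + ".dbf")		&& table missing entirely: create it
		cCmd = "CREATE TABLE " + cTable + " (" + cDef + ")"
		&cCmd
		USE IN (cTable)				&& close the newly created table
	ELSE							&& table exists: check for the field
		USE (cTable) IN 0 SHARED ALIAS chktbl
		nFlds = AFIELDS(aFlds, "chktbl")
		USE IN chktbl				&& close it: ALTER TABLE needs exclusive use
		IF ASCAN(aFlds, UPPER(ALLTRIM(fldname))) = 0	&& simplified lookup
			cCmd = "ALTER TABLE " + cTable + " ADD COLUMN " + cDef
			&cCmd
		ENDIF
	ENDIF
ENDSCAN
USE IN appschema
```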
With SQL Server tables added to the mix, the same method can be used. To make it faster, the schema table can be uploaded to SQL Server and an SP can be created to update the table structures on the server side.
The example below is modeled on the SQL Server interpreter sample. It adds creating a stored procedure with a parameter, invoking that SP, removing that SP, and dynamically adding a column to a table.
SQLDISCONNECT(0)	&& close all connections
nh=SQLSTRINGCONNECT("driver=sql server;server=(local);trusted_connection=yes")	&& adjust the connection string for your server
IF SqlDoIt(nh,"use test")<0	&& if we can't use the test database (alt: use sp_helpdb)
	SqlDoIt(nh,"create database test")
	SqlDoIt(nh,"use test")
ENDIF
SET TEXTMERGE ON TO memvar myvar NOSHOW	&& put the following text into a variable called myvar
SQLTABLES(nh)	&& get a table of tables
SELECT * FROM sqlresult WHERE table_type="TABLE" AND table_name="cust" INTO CURSOR foo
IF _TALLY>0	&& drop the cust table only if it already exists
	\DROP TABLE cust
ENDIF
\CREATE TABLE cust (name char(10),amount tinyint,tstamp datetime,myxml ntext)
FOR i = 1 TO 10
	\INSERT INTO cust (name,amount,tstamp,myxml) VALUES ('test<<TRANSFORM(i)>>',<<i>>,'<<DATETIME()>>','<myxml><<i>></myxml>')
ENDFOR
\alter table cust add newcol char(10)
\update cust set newcol='foo'
\select * from cust
\!list off name,amount,tstamp,myxml
* \drop procedure mySProc
\create procedure mySProc @parm1 char(10) as select * from cust where name='test8' or name=@parm1
\!?"Exec my sproc"
\exec mySProc 'test2'	&& invoke the SP with a parameter; results go into sqlresult
\!list off name,newcol,amount,tstamp,myxml
\drop procedure mySProc
* \!brow last
\use master	&& use the master database, so test is not in use anymore
\drop database test
SET TEXTMERGE TO	&& stop directing text to myvar
n=ALINES(aa,myvar)	&& put the lines into an array
FOR i = 1 TO n	&& for each line
	IF SqlDoIt(nh,aa[i])<0	&& execute it; stop on the first error
		LIST MEMORY LIKE aErrs
		EXIT
	ENDIF
ENDFOR
SQLDISCONNECT(nh)

PROCEDURE SqlDoIt(nH as Integer, cCmd as String)
	cCmd=ALLTRIM(cCmd)
	IF LEFT(cCmd,1)='!'	&& use "!" for Fox commands
		cCmd=SUBSTR(cCmd,2)
		&cCmd	&& execute the Fox command
		nRetval=1
	ELSE
		nRetval=SQLEXEC(nH,cCmd)	&& execute via SQL pass through
		IF nRetval<0
			AERROR(aErrs)	&& capture the error details
		ENDIF
	ENDIF
	RETURN nRetval