#### Correcting mismatched user IDs (suid) with logins when migrating or refreshing a database from one environment to another
```sql
sp_configure 'allow updates', 1
go
use user_db
go
begin tran
update sysusers
set suid = suser_id(user_name(uid))
from sysusers
where uid between 2 and 16000
and suser_id(user_name(uid)) is not null
go
sp_helpuser
go
commit tran
go
use master
go
sp_configure 'allow updates', 0
go
```
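Before running the update above, it can help to see which database users are actually mismatched or orphaned. A minimal pre-check along the same lines (run in the same `user_db`) might look like this; it only reads `sysusers`, so it needs no `allow updates`:

```sql
-- Sketch: list users whose stored suid no longer matches
-- the suid of the login with the same name.
use user_db
go
select u.name,
       u.suid          as stored_suid,
       suser_id(u.name) as login_suid
from sysusers u
where u.uid between 2 and 16000
  and ( suser_id(u.name) is null          -- orphaned: no matching login
     or suser_id(u.name) != u.suid )      -- mismatched suid
go
```

Rows where login_suid is null are orphans the update will skip (the `is not null` filter); rows where the two suids differ are the ones the update will repair.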
#### Reporting: SQL Performance and Tuning
This is a list of some techniques used successfully at several different sites.

**Getting Maximum Index Usage**

1) Verify tables have had "update statistics" run on them, and that "sp_recompile" has been run against them.
2) Verify any declared variables have the same datatype as their corresponding columns - this is a common pitfall.
3) Force index usage with a hint, as follows:

```sql
from customer (index idx_customer2)
```

4) Use SET TABLE COUNT. Example: `set table count 6`. Then compile the procedure in the same session.
5) If temp tables are being used, put the temp table creation statements in one procedure, and the processing SQL in another procedure. This allows the optimizer to form a query plan on the already-established tables. Example: proc_driver calls proc_create_temp_tables, then proc_driver calls proc_generate_report.

**General SQL Programming**

- Plan for growth. Assume the driver table doubled or tripled in size; would the report still function?
- Avoid pointless comparisons in the where clause, like `where @emp_id > 0`.
- Use "WHERE EXISTS ( )" rather than "WHERE NOT EXISTS".
- Use "!=" rather than "<>".
- Use "IS NOT NULL" rather than "<> NULL".
- Use "IS NULL" rather than "= NULL".
- Avoid distinct if possible; see the cursor loop option below.
- Use meaningful names for temp tables; don't use #temp.

**Report Structure Approaches**

1) Single query

Single query reports are rare - usually they involve getting a simple list together.

- Don't try to 'shoehorn' SQL into one statement. Shorter programs are great for C or Perl applications, but that is not the case in SQL. Think "bigger is better" (and more maintainable).
- Keep queries from using more than four tables if possible.

2) Cursor on driver table(s), with IF..THEN processing in the loop

Using a cursor for complex reports almost always increases performance when large tables and many joins are involved.

- Keep cursor queries from using more than two tables if possible, and make sure the cursor query performs well on its own.
- Try to have a unique key of some sort available within the tables involved. Strange results have been known to occur when a cursor scans rows that are exactly alike.
- Don't use cursors for updating.
- Use IF statements to filter results even further. In most cases, a code construct like the one below is better than cramming the logic into a where clause:

```sql
if <cond-a>
begin
    if <cond-b> and <cond-c>
        .....
    else
        ....
end
```

3) Set processing without cursors

This technique should be attempted when even a cursor construct fails to achieve the desired performance. Basically, the driver query is re-run with each iteration of the loop.

Sample, with cursor:

```sql
declare cursor1 cursor for
select emp_id, last_name, salary
from employee

open cursor1
fetch cursor1 into @emp_id, @last_name, @salary
while (@@sqlstatus = 0)
begin
    < processing >
    fetch cursor1 into @emp_id, @last_name, @salary
end
close cursor1
```

Sample, with set processing:

```sql
select @emp_id = 0, @loop = 1
while (@loop > 0)
begin
    set rowcount 1
    select @emp_id = emp_id, @last_name = last_name, @salary = salary
    from employee
    where emp_id > @emp_id
    order by 1

    select @loop = @@rowcount
    set rowcount 0
    if @loop > 0
    begin
        < processing >
    end
end
```

**Transaction Log Filling Up?**

If the transaction log is filling up, for tempdb or the main database, there is likely something wrong with the report logic. Things to check:

- Instead of repetitively updating each row, can the values be obtained ahead of time and then inserted in a single transaction?
- Are "joined" updates hitting each row exactly once? When updating using a join, make sure the tables in question are joined in a way that avoids duplicate rows. Try running the SQL statement as a SELECT to check it out.
- Are you cramming 500,000 rows from a temp table into a db table? Try eliminating the temp table.
- Create indexes on updated/inserted tables after the fact.
- Use "set rowcount" along with "waitfor delay" if log problems persist.

*** A proper DBA will never extend the log segment based on the needs of a single process.
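The last tip above can be sketched as a batched-update loop: "set rowcount" caps the rows touched per transaction, and "waitfor delay" gives the log a chance to be dumped between batches. The table and column names here (big_table, status) are hypothetical placeholders:

```sql
-- Sketch: update in batches of 1000 rows, pausing between batches
-- so the transaction log can be truncated/dumped as you go.
declare @rows int
select @rows = 1
while (@rows > 0)
begin
    set rowcount 1000                -- cap the batch size
    update big_table                 -- hypothetical table
    set status = 'DONE'              -- hypothetical column
    where status = 'PENDING'
    select @rows = @@rowcount
    set rowcount 0                   -- reset to unlimited
    if @rows > 0
        waitfor delay '00:00:05'     -- let the log catch up
end
```

Each pass commits a small transaction, so the log never has to hold the whole update at once.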