Architecture approach - scoring (weighted)


hi all, i've been brought onto a project to try to determine the best approach to the problem described hereafter. it's a bit different from anything i've dealt with before, so i'm hoping someone may have insight into a solution. the project goal is to take inline sql that is executed for each record (either an insert or an update) across several tables. the number of records can exceed one million, and this is a production database, so it's not recommended to have a process that cripples performance for daily business. the processing applies weighted scoring against the records, and since the way the scoring is determined can be set by the user, updates come into play on the same records. the current code performs a foreach and cycles through each record, applying the inline sql one record at a time, which takes entirely too long to complete. my thoughts on possible approaches: move the logic to a stored procedure, use the asynchronous portions of the .net 4.0 framework, move to an ssas cube, use an ssis package to perform the job, or some combination thereof. please let me know if there is a glaring solution i haven't seen, and thanks in advance for any insight.
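For context, a common alternative to the per-record foreach is a single set-based UPDATE, optionally batched to limit blocking on a busy production server. This is only a sketch: `Accounts`, `UserWeights`, the `Metric*`/`Weight*` columns, and the `ScoredOn` marker column are hypothetical names, not from the original post, and the weights table is assumed to hold one row per user.

```sql
-- Hypothetical schema; replace table and column names with your own.
-- Compute the weighted score for the whole set in one statement
-- instead of issuing one UPDATE per record from .NET.
UPDATE a
SET    a.Score = (a.Metric1 * w.Weight1)
               + (a.Metric2 * w.Weight2)
FROM   dbo.Accounts AS a
CROSS JOIN dbo.UserWeights AS w   -- user-configurable weights, one row
WHERE  w.UserId = @UserId;

-- If one UPDATE over a million-plus rows causes too much blocking
-- or log growth, process it in batches instead:
WHILE 1 = 1
BEGIN
    UPDATE TOP (50000) a
    SET    a.Score    = (a.Metric1 * w.Weight1)
                      + (a.Metric2 * w.Weight2),
           a.ScoredOn = @RunDate        -- marker so batches don't repeat rows
    FROM   dbo.Accounts AS a
    CROSS JOIN dbo.UserWeights AS w
    WHERE  w.UserId   = @UserId
      AND (a.ScoredOn < @RunDate OR a.ScoredOn IS NULL);

    IF @@ROWCOUNT = 0 BREAK;
END;
```

The batched form trades total runtime for shorter individual transactions, which is usually the better deal on an OLTP server that other users are hitting at the same time.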
enigmatic one

processing datasets in .net ought to be slower than a straight piece of set-based sql, oughtn't it? where is this going to be processed?

surely not on the client? hauling a million-plus records across the wire twice?

if it's on database server double memory required on set based processing.   how memory 2 million records?   how many concurrent users can this?  bringing oltp dabase server grinding halt pretty unpopular.

 

if you need to recalculate intra-day you have a problem.

what happens when the data is used for whatever it's used for, and part of it has been recalculated and part hasn't?

i'd rather have a clear rule that changes are reflected the next day.

 

so here's an option to consider.

if rules can encapsulated in select statement use view , forget holding calculated values.

you could also do both: offer yesterday's value and the current value, with yesterday's value held in a column rewritten by an overnight job (or jobs) using the value out of the view.
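The view-plus-overnight-job idea could be sketched like this. Again, the names (`vwAccountScore`, `Accounts`, `UserWeights`, `YesterdayScore`) are illustrative only, and a single-row weights table is assumed so the view needs no parameter:

```sql
-- Current value: always computed on the fly from the scoring rules.
CREATE VIEW dbo.vwAccountScore
AS
SELECT a.AccountId,
       (a.Metric1 * w.Weight1)
     + (a.Metric2 * w.Weight2) AS CurrentScore
FROM   dbo.Accounts AS a
CROSS JOIN dbo.UserWeights AS w;   -- assumed single-row weights table
GO

-- Yesterday's value: a plain column on the base table.
ALTER TABLE dbo.Accounts ADD YesterdayScore decimal(18, 4) NULL;
GO

-- Body of the overnight SQL Agent job: rewrite the stored column
-- from the value the view computes.
UPDATE a
SET    a.YesterdayScore = v.CurrentScore
FROM   dbo.Accounts AS a
JOIN   dbo.vwAccountScore AS v
  ON   v.AccountId = a.AccountId;
```

With this split, intra-day readers get a stable `YesterdayScore` while the view always reflects the latest user-configured weights, and the only heavy write happens once, off-hours.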

 





