How to handle heavy data in .NET


Hi,

I am dealing with huge data. The data is stored in files, and the application loads it into memory as float arrays.
Each file is a few hundred MB in size, and there can be multiple such files.
Currently I keep the entire data in memory for performance. Because of that, I am not able to open more than one file at a time; doing so gives a System.OutOfMemoryException.
Can you suggest techniques that would give me better performance and memory utilization? I am working in .NET 3.5, C#.

The application is a kind of image-processing application; the huge data set is displayed on a (third-party) chart and can be interacted with manually.

Thanks and regards,
- jc

Which is it? 512 * sizeof(float[32768]) = 64 MB, or 512 * 512 * sizeof(float[32768]) = 32 GB?

64 MB is tiny, and you shouldn't have any trouble at all.

32 GB is huge, and you're going to be pushing the boundaries of available physical memory, and of virtual address space, at every turn.
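For concreteness, here is the back-of-envelope arithmetic, assuming the two scenarios are 512 segments versus 512 × 512 segments of float[32768] (the segment counts are my reading of the question, not stated explicitly):

```csharp
// sizeof(float) == 4 bytes, so each float[32768] segment is 128 KB.
const long SegmentBytes = 32768L * sizeof(float);     // 131,072 B = 128 KB
const long SmallCase = 512L * SegmentBytes;           // 67,108,864 B      = 64 MB
const long LargeCase = 512L * 512L * SegmentBytes;    // 34,359,738,368 B  = 32 GB
```

The large case alone exceeds the 2 GB user-mode address space of a 32-bit process many times over, which is why the distinction matters so much.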

You might think about implementing your own LRU cache to manage a GB or so of float[32768] segments. The advantage of doing this, rather than relying on the garbage collector, is that you get to choose the policy that controls which segments are discarded. A garbage-collection-based scheme is going to lose weakly referenced large-object data on any suitably high-level garbage collection; and if you don't use weak references, garbage collection won't collect the garbage at all.
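A minimal sketch of such a cache, using only .NET 3.5 types (Dictionary, LinkedList, Func). The class name, the segment-index keying, and the caller-supplied loader delegate are all illustrative assumptions, not an existing API:

```csharp
using System;
using System.Collections.Generic;

// LRU cache for float[] segments. On a miss, the caller-supplied loader
// reads the segment from disk; on overflow, the least recently used
// segment is discarded -- the policy is ours, not the GC's.
public class LruSegmentCache
{
    private readonly int _capacity;                    // max segments resident
    private readonly Func<int, float[]> _load;         // loads segment i from disk
    private readonly Dictionary<int, LinkedListNode<KeyValuePair<int, float[]>>> _map
        = new Dictionary<int, LinkedListNode<KeyValuePair<int, float[]>>>();
    private readonly LinkedList<KeyValuePair<int, float[]>> _lru
        = new LinkedList<KeyValuePair<int, float[]>>(); // front = most recently used

    public LruSegmentCache(int capacity, Func<int, float[]> load)
    {
        _capacity = capacity;
        _load = load;
    }

    public float[] Get(int segmentIndex)
    {
        LinkedListNode<KeyValuePair<int, float[]>> node;
        if (_map.TryGetValue(segmentIndex, out node))
        {
            _lru.Remove(node);                 // hit: move to front
            _lru.AddFirst(node);
            return node.Value.Value;
        }

        if (_map.Count >= _capacity)           // miss on a full cache: evict LRU
        {
            LinkedListNode<KeyValuePair<int, float[]>> last = _lru.Last;
            _lru.RemoveLast();
            _map.Remove(last.Value.Key);
        }

        float[] data = _load(segmentIndex);
        _map[segmentIndex] = _lru.AddFirst(new KeyValuePair<int, float[]>(segmentIndex, data));
        return data;
    }
}
```

With a 1 GB budget and 128 KB segments, a capacity of 8192 segments would fit; both lookup and eviction are O(1).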

Reading 32 GB of data off disk isn't going to be fast. You should go in with modest expectations, unless there are brilliant strategies for reducing the total working set at any given time.

If you have to touch every byte of the 32 GB every time you process the data, you may as well do it serially, one file at a time; an LRU cache isn't going to help. At that point you're not memory-bound, you're bound by how fast you can read 32 GB of data off disk.
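The serial, one-chunk-resident approach might look like this. It assumes each file is a raw sequence of floats in native byte order, and the method and callback names are hypothetical:

```csharp
using System;
using System.IO;

public static class ChunkedReader
{
    // Streams one file through a fixed-size buffer, so only a single chunk
    // is ever resident, regardless of the file's total size.
    public static void ProcessFileInChunks(string path, int floatsPerChunk,
                                           Action<float[], int> processChunk)
    {
        byte[] bytes = new byte[floatsPerChunk * sizeof(float)];
        float[] chunk = new float[floatsPerChunk];
        using (FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read,
                                              FileShare.Read, 1 << 16)) // 64 KB OS buffer
        {
            int filled;
            do
            {
                // Fill the buffer as far as possible; Read may return short counts.
                filled = 0;
                int got;
                while (filled < bytes.Length &&
                       (got = fs.Read(bytes, filled, bytes.Length - filled)) > 0)
                    filled += got;

                int floats = filled / sizeof(float);
                if (floats > 0)
                {
                    Buffer.BlockCopy(bytes, 0, chunk, 0, floats * sizeof(float));
                    processChunk(chunk, floats); // caller sees one chunk at a time
                }
            } while (filled == bytes.Length);
        }
    }
}
```

The 64 KB FileStream buffer plus sequential access lets the OS read-ahead do the heavy lifting; the chunk size only bounds peak managed memory, not throughput.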


.NET Framework  >  Common Language Runtime Internals and Architecture


