Low performance when computing large files
This topic has 6 replies, 3 voices, and was last updated 10 years, 9 months ago by Anonymous.
February 28, 2014 at 12:18 pm #12849 | Anonymous (Inactive)
Hello,
We need to process large data files (Dewesoft *.d7d, Nicolet Recording Files *.nrf).
These files, about 2 GB each, contain:
– 6 to 8 channels
– 2 hours of recording at 5,000 values per second
which amounts to:
– 36 million values per channel
– 288 million values per file (see the worked figures at the end of this post)
Currently, we use FlexPro successfully to process smaller files (less than 500 MB).
With these files we experience many performance problems: opening files, computing formulas (even simple ones that apply to the whole recording and combine 3 to 6 channels), and drawing charts are all very slow.
We found many optimization options in the documentation and tried them: they are not sufficient.
The tests were carried out with:
– FlexPro 8 Professional
– computer: Windows XP, 3 GB RAM, Intel Centrino 2 Core @ 2.4 GHz
The most complex calculations (in particular FFT) were not tested.
Some rival software performs much better than FlexPro on this same computer.
We plan to test FlexPro 9 Professional on a more powerful machine running 64-bit Windows 7, to find out whether FlexPro is suitable for this task.
Could you recommend a computer configuration (memory, number of processors/cores, …)?
Are there other solutions to explore with FlexPro?
Is FlexPro suitable?
Thanks.
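For reference, the figures above follow directly from the recording parameters; here is a minimal Python check (the 8 bytes per value is an assumption about 64-bit floating-point storage, not something taken from the file formats):

```python
# Worked check of the stated data volumes.
# Assumption: 8-byte (64-bit floating-point) samples.
HOURS = 2
SAMPLE_RATE = 5_000      # values per second
CHANNELS = 8
BYTES_PER_VALUE = 8      # assumption

values_per_channel = HOURS * 3600 * SAMPLE_RATE          # 36,000,000
values_per_file = CHANNELS * values_per_channel          # 288,000,000
file_size_gb = values_per_file * BYTES_PER_VALUE / 1e9   # ~2.3, consistent with the ~2 GB files

print(values_per_channel, values_per_file, round(file_size_gb, 1))
```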
March 3, 2014 at 9:29 am #9345 | Bernhard Kantz (Participant)
Reading large files takes some time, especially from slow devices (network shares, hard drives, etc.). When datasets cannot be held in main memory, they are swapped out to the temporary folder, which lowers performance during processing. For example, if you work with signals of 36 million values and use the default system setting for the maximum size of data sets in memory of 10 megabytes, the data will not reside in RAM but has to be loaded from disk. In your case this size should be at least 600 MB to avoid swapping. If you want to hold eight such signals in memory, you should raise the maximum memory allocation for data sets setting to at least 5 GB. That implies a computer with 64-bit Windows and no less than 8 GB of main memory. Starting with FlexPro 9.1 we offer a 64-bit release capable of using more than 2 GB of RAM. The overall performance of course also benefits from fast hard drives. These measures should speed up computations once the data has been read, but the initial loading of the files may still take some time.
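The suggested values can be estimated from the signal sizes; a rough sketch of that arithmetic (the factor of two for a Y plus an X component per signal is an assumption, not something stated above):

```python
# Rough derivation of the settings suggested above.
# Assumptions: 64-bit floating-point values, and each signal holding
# two components of equal length (Y values plus X values).
VALUES_PER_CHANNEL = 36_000_000
BYTES_PER_VALUE = 8
COMPONENTS_PER_SIGNAL = 2    # assumption
SIGNALS = 8

per_signal_mb = VALUES_PER_CHANNEL * BYTES_PER_VALUE * COMPONENTS_PER_SIGNAL / 1e6
total_gb = per_signal_mb * SIGNALS / 1e3

print(round(per_signal_mb))  # ~576 -> "Maximum size of data sets in memory" of at least 600 MB
print(round(total_gb, 1))    # ~4.6 -> "Maximum memory allocation for data sets" of at least 5 GB
```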
March 13, 2014 at 4:45 pm #9347 | HerveM1234 (Participant)
Hi,
With 64-bit FlexPro 9 running on 64-bit Windows 7, I didn’t notice any difference from the 32-bit version.
On an 8-core system (i7 3770), FlexPro uses only 4 of the 8 cores and 12% of the CPU!
The memory used is always less than 4 GB and it takes a while to calculate formulas.
Thanks!
March 14, 2014 at 9:34 am #9350 | Bernhard Kantz (Participant)
The main improvement of the 64-bit version is the capability to use more than 2 gigabytes of main memory for datasets. Adjusting the settings Maximum memory allocation for data sets and Maximum size of data sets in memory to the available amount of main memory allows FlexPro to hold large datasets in memory instead of swapping them out to disk. These settings can be modified on the System Settings tab of the Options dialog in the Tools menu.
On the same page, the parallel update feature can be activated in the Professional edition. Currently the parallel update is used mainly for the construction of presentation objects such as diagrams, tables, and documents. For FlexPro 10 it is planned to extend the FPScript language, for example with a parallel for-loop, to improve the performance of formula evaluation.
March 14, 2014 at 9:53 am #9352 | HerveM1234 (Participant)
OK for that.
I still don’t understand why the CPU is not fully used!
Thanks!
Sorry, wrong picture above!
March 17, 2014 at 10:02 am #9353 | Bernhard Kantz (Participant)
Since parallelization is currently used only for the creation of presentation objects, peak CPU usage will be seen when documents containing multiple diagrams with curves based on large datasets are created.
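As a generic illustration only (plain Python, not FPScript, and not how FlexPro itself is implemented), the kind of parallel loop over channels planned for FPScript would let a per-channel formula use all cores, roughly like this:

```python
# Generic illustration of a parallel per-channel computation (not FPScript).
from multiprocessing import Pool

import numpy as np

def rms(channel: np.ndarray) -> float:
    """A simple per-channel formula, here the RMS value."""
    return float(np.sqrt(np.mean(channel ** 2)))

if __name__ == "__main__":
    # Hypothetical data: 8 channels of 36 million values each would be ~2.3 GB;
    # a smaller array is used here so the example runs quickly.
    channels = [np.random.rand(1_000_000) for _ in range(8)]

    with Pool() as pool:                   # one worker per CPU core by default
        results = pool.map(rms, channels)  # each channel is processed in parallel

    print(results)
```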