
Helium 12 does not use all the available power and slows down...


I had 400 GB of music files to import.

The process was very long, whereas my hardware is quite comfortable (SSD, 6-core CPU, 12 GB memory).

Task Manager shows that Helium only uses about 10% of the computer's resources...

I wish it could use 50% or more to make the import task quicker.

Best Regards


This is not a bug and unfortunately cannot be made (much) quicker with such large collections.

You cannot judge this from the Task Manager alone, since the work is spread over multiple processes, so it is not true that it uses only 10%.

Some parts are under review for minor optimizations, but the speed problems should only occur when adding many files at once (>20,000), so we do not rate this as a major issue since it is not a "daily task".

But in my observations, even when only calculating BPM or gain, Helium uses around 10% of my CPU, which suggests some improvement might be possible through multithreading or something similar.


@Infusion: Yes, but please note that there are other operations involved. During an add, Helium delegates CPU time to the database drivers, which consume the most time (depending on the operation).

For gain calculations, external tools are used, which is a limitation. The same (kind of) applies to BPM calculation, where we use BASS to decode the audio stream in order to run the calculation, which is a constraint.

Do you commit your SQL queries one by one, or in batches of 10 or more (with comma-separated values, or grouped inside a transaction)? The first is very slow; the second is much faster.
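As an illustration of that difference (a minimal sqlite3 sketch with a hypothetical `tracks` table, not Helium's actual code), compare committing per statement with committing once per batch:

```python
import sqlite3

def insert_one_by_one(conn, rows):
    # One commit per statement: every commit forces a separate disk sync.
    for path, title in rows:
        conn.execute("INSERT INTO tracks (path, title) VALUES (?, ?)", (path, title))
        conn.commit()

def insert_batched(conn, rows):
    # One transaction for the whole batch: a single commit at the end.
    with conn:  # commits on success, rolls back on error
        conn.executemany("INSERT INTO tracks (path, title) VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tracks (path TEXT, title TEXT)")
rows = [(f"/music/{i}.mp3", f"Track {i}") for i in range(1000)]
insert_batched(conn, rows)
count = conn.execute("SELECT COUNT(*) FROM tracks").fetchone()[0]
print(count)  # → 1000
```

On a real on-disk database the per-statement version pays a journal flush for every row, which is where most of the slowdown comes from.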

Also, I noticed very low memory use during the process.

I guess you currently proceed like this:

- read the folder/file information
- prepare the query
- commit the query
- handle any exception or error
- go to the next file...

Maybe you could process it like this instead:

- read the folder/file information for 20 (or more) files
- store the information in memory
- prepare the query and commit it as one batch
- handle any exception or error
- go to the next batch...
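The batched steps above can be sketched like this (a hypothetical schema and a placeholder metadata reader, not Helium's code):

```python
import sqlite3

BATCH_SIZE = 20  # read this many files before committing

def read_metadata(path):
    # Placeholder for real tag reading; here the title is just the filename.
    return (path, path.rsplit("/", 1)[-1])

def flush(conn, batch):
    try:
        with conn:  # one transaction per batch
            conn.executemany("INSERT INTO tracks (path, title) VALUES (?, ?)", batch)
    except sqlite3.Error as exc:  # handle any exception or error
        print(f"batch failed: {exc}")

def import_files(conn, paths):
    batch = []
    for path in paths:
        batch.append(read_metadata(path))  # store the information in memory
        if len(batch) >= BATCH_SIZE:
            flush(conn, batch)             # prepare the query and commit it
            batch = []
    if batch:
        flush(conn, batch)                 # commit the final partial batch

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tracks (path TEXT, title TEXT)")
import_files(conn, [f"/music/{i}.mp3" for i in range(45)])
count = conn.execute("SELECT COUNT(*) FROM tracks").fetchone()[0]
print(count)  # → 45
```

Buffering a batch in memory trades a little RAM for far fewer commits, which matches the observation that memory use stays low during imports.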

Or another approach:

- count the number of CPU cores (4, for example)
- create 4 threads to split the add process into multithreaded operations

The submission method depends on the active database type and the type of work.

But records are always added in batches using the recommended technology and components.
