scaling / performance

scaling / performance

Derick Fay
Hi,
I recently opened, concurrently, two slightly different copies of a
bibliography with just over 3900 entries, all of which have external
URLs attached.  With both open, one CPU core was pegged at 100% and
BD was beachballing for minutes at a time (on a 2 GHz Intel Core Duo
MacBook with 1.5 GB RAM, running 10.5.4 and BD 1.3.18).  I was able
to copy the contents of one into the other and search for duplicates
by title (with a delay of several minutes between every action), but
after deleting the 7750 or so duplicates, which I'd hoped would speed
things up, BD instead beachballed indefinitely and I force quit after
about 10 minutes.
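(For what it's worth, duplicate titles can also be previewed outside BD.
A rough shell sketch, assuming each `title = {...}` field fits on a
single line; `library.bib` here is just a stand-in demo file, not my
real bibliography:)

```shell
# Demo bibliography with a case-insensitive duplicate title.
cat > library.bib <<'EOF'
@article{a, title = {Scaling Performance}}
@article{b, title = {scaling performance}}
@article{c, title = {Something Else}}
EOF

# List titles that occur more than once (case-insensitive),
# assuming each title = {...} field fits on a single line.
grep -io 'title *= *{[^}]*}' library.bib | sort -f | uniq -di
```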

I'm hoping to deal with this by splitting the bibliography into a few
different files.  But I'm more concerned about my main working
bibliography, which has about 1900 entries and which I'd really prefer
to keep in a single file for everyday use.  I've noticed some
performance degradation going from 1200 or so entries to 1900, but
nothing like what I experienced with 3900, so it seems there's a
non-linear relationship involved.  I have one more file cabinet of 300+
articles to add to that bibliography, so I'm hoping it doesn't get much
worse.

I don't know whether these issues can be addressed without a major
rewrite, but in future versions I'd be more interested in
performance-focused improvements than in new features.

Derick






-------------------------------------------------------------------------
This SF.Net email is sponsored by the Moblin Your Move Developer's challenge
Build the coolest Linux based applications with Moblin SDK & win great prizes
Grand prize is a trip for two to an Open Source event anywhere in the world
http://moblin-contest.org/redirect.php?banner_id=100&url=/
_______________________________________________
Bibdesk-users mailing list
[hidden email]
https://lists.sourceforge.net/lists/listinfo/bibdesk-users

Re: scaling / performance

Maxwell, Adam R
On 09/15/08 08:29, "Derick Fay" <[hidden email]> wrote:

> Hi,
> I recently opened, concurrently, two slightly different copies of a
> bibliography with just over 3900 entries, all of which have external
> URLs attached.  With both open, one CPU core was pegged at 100% and
> BD was beachballing for minutes at a time (on a 2 GHz Intel Core Duo
> MacBook with 1.5 GB RAM, running 10.5.4 and BD 1.3.18).  I was able
> to copy the contents of one into the other and search for duplicates
> by title (with a delay of several minutes between every action), but
> after deleting the 7750 or so duplicates, which I'd hoped would speed
> things up, BD instead beachballed indefinitely and I force quit after
> about 10 minutes.

This is not expected behavior, particularly with the hardware you're using;
I can open and search a 20,000 item file with only minor slowdowns.  If you
can reproduce a beachball, take a sample (or 2-3 samples) with Activity
Monitor and post the output.
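(For anyone unfamiliar with sampling: Activity Monitor's "Sample
Process" button does this from the GUI, and the same data can be
captured from Terminal with the macOS `sample` tool while BD is
beachballing.  A sketch; the filenames are arbitrary, and this only
works on a Mac with BibDesk running:)

```shell
# macOS only: capture three call-stack samples of the running BibDesk
# process, 10 seconds each, writing each sample to its own text file.
for i in 1 2 3; do
  sample BibDesk 10 -file "BibDesk-sample-$i.txt"
done
```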
 
> I'm hoping to deal with this by splitting the bibliography into a few
> different files.  But I'm more concerned about my main working
> bibliography, which has about 1900 entries and which I'd really prefer
> to keep in a single file for everyday use.  I've noticed some
> performance degradation going from 1200 or so entries to 1900, but
> nothing like what I experienced with 3900, so it seems there's a
> non-linear relationship involved.  I have one more file cabinet of
> 300+ articles to add to that bibliography, so I'm hoping it doesn't
> get much worse.

4000 entries isn't large enough to be worth splitting into multiple
files.  20-50K is where I'd think about that, and then only if you have
a slower (single-core) system and/or numerous smart groups.
 
> I don't know whether these issues can be addressed without a major
> rewrite, but in future versions I'd be more interested in
> performance-focused improvements than in new features.

Unfortunately, it's impossible to make performance improvements without
specific information (e.g. samples).  Guesses about bottlenecks, by
developers and users alike, are typically wrong and lead to much wasted
effort.

--
adam

