
Tips for large databases

{{man tip|Out of date|The advice on this page was written for older versions of Gramps, so it may not work for you. Please update as needed.}}{{stub}}<!--check accuracy / Incorporate "Import tested with the GRANDMA Mennonite database of 1.4 million people." by user on reddit: https://www.reddit.com/r/gramps/comments/dzevcl/database_size_limit_for_gramps/fb6hdbj/ -->
Large family tree data files: what to do, and what not to do.
==Loading the file==
{{man note|Should work for BSDDB versions of Gramps}}Initial import of a large database (100,000+ people) from either Gramps XML or GEDCOM can take a few hours. You will need to adjust the number of allowable locks. For 140,000 people you should use:
* ''max_locks'' 300000
* ''max_objects'' 300000
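For BSDDB-backed family trees, these limits can be raised by placing a <code>DB_CONFIG</code> file in the family tree's database directory; Berkeley DB reads it the next time the environment is opened. A minimal sketch follows; the directory path is an example only, so substitute the actual tree directory shown in your Gramps setup:

```shell
# Sketch: create a DB_CONFIG file in the family tree's database directory.
# TREE_DIR below is an assumed example path; find your real tree directory
# in your Gramps configuration before using this.
TREE_DIR="${TREE_DIR:-$HOME/.gramps/grampsdb/example-tree}"
mkdir -p "$TREE_DIR"
cat > "$TREE_DIR/DB_CONFIG" <<'EOF'
set_lk_max_locks 300000
set_lk_max_objects 300000
EOF
cat "$TREE_DIR/DB_CONFIG"
```

After creating the file, close and reopen the family tree so the new lock limits take effect.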
With this many people, loading the person view may take some time. See a comparison of [[Gramps Performance|performance]] on large datasets between different Gramps versions.
You can avoid this load time by switching to the [[Gramps Relationships Category view screenshot|relationships view]] before opening the family tree. This will open the active person and their family extremely quickly, regardless of how much data you have gathered.
You can add bookmarks to frequently used people, or to people in the branch you are researching. This lets you change the active person in the relationships view without first activating them in the person view.
== Avoid Gramplets ==
Avoid gramplets that do a lot of database work. The '''[[Addon:Deep_Connections_Gramplet|Deep Connections]]''' Gramplet seems to be the worst case. These types of gramplets slow everything down enormously.
== Avoid general filters ==