SuperPAF/HyperPAF Idea

Discussions around Genealogy technology.
User avatar
huffkw
Member
Posts: 54
Joined: Sun Jan 21, 2007 6:34 pm
Location: Spanish Fork, Utah
Contact:

SuperPAF/HyperPAF Idea

Postby huffkw » Sun Jan 21, 2007 7:36 pm

SuperPAF/HyperPAF Idea

I attended the recent TechTalk in Salt Lake City (Jan. 18) and found it very helpful. I want to take this opportunity to put in my two cents worth of opinion. I am happy to see the Church starting a new set of genealogy data projects.

A new worldwide missionary/genealogy system is needed
My main personal interest is in seeing Church genealogy systems include a major strategic element that is aimed at serving the entire world and drawing hundreds of millions of people into using it. Our existing database systems, containing many Americans and Europeans who made up so much of our early membership, could be viewed as an extended “beta test” we are still working on before rolling it out to the world.

The overriding goal of the coming worldwide system should be to entice billions of people who have never heard of the Church to ask themselves why the Church even cares about genealogies, even as those people are actively and happily using the system to enter their own genealogy for preservation purposes. Their questions will eventually lead them to learn about our Plan of Salvation. Every person on the planet cares about their ancestors, and many include that concern in their religions.

In summary, I believe we need a truly generalized worldwide “genealogy research results storage” system that encourages and supports extensive cooperation, something that is quite a bit more than the specialized temple work system we have now. It is possible now to go far beyond the Ancestral File, Pedigree Resource File, and similar commercial efforts of an earlier day.

This system is intended to sit on top of all the large “raw genealogy data” databases in the world, acting as the lineage-linked summary and index to it all.

The elusive “One World Family Tree”
As with our past systems, it should do at least as much missionary work as genealogy work, but on a much bigger scale.

Success with this grand idea has eluded many a serious attempt in the past, but it is now perfectly feasible to do a really fine job of it, if the genealogy community is willing to study and accept a slightly different paradigm, a new way of thinking about the old goals and methods. This year seems like the perfect time to explore it, with all the new resources going into genealogy systems. I could describe the suggested system to any desired level of detail, but this is probably not the right time and place to try to do that. (I do have a running prototype at http://www.genreg.com – which needs more introductory verbiage and pictures. There is lots of theory (ten years worth) under the hood that is hard to spell out quickly).

At the risk of causing instant misunderstanding, skepticism, and controversy, I will briefly describe three key features.

The most notable internal difference comes in storing the “finished” data in descendant format, because that discipline almost automatically eliminates 95% of the duplication problems and questions we see today.

The most notable external feature is that people could, theoretically, dispense with all their PC-based software and enter and modify all their data online, having near-instant access to everyone else’s best work as well, making ease of cooperation a major benefit. There are three software layers: 1) one that allows an individual to enter and modify his or her own data, as with today’s PAF; 2) a higher-level “SuperPAF” that allows many members of an extended family (or workgroup) to enter updates simultaneously (with appropriate controls); and 3) the highest-level “HyperPAF,” which allows anyone in the world to view lineage-linked data that has been designated as “public,” and to enter competing, but nondestructive, overlays of data for particular people or families.

The third crucial feature provides for the explicit linking of name records to source records and images or other documents or multimedia objects, by actual URL or by more generic library call-number references. These important links need no longer be buried in someone’s obscure notes; they can become part of the active database. Perfecting this new system using the American/European data “beta test” should allow it to be rolled out to the world with high credibility for all to use.

(Notes: 1. Purely practical and performance reasons would cause PAF-like software to be used for a long time, even if only as an assembler of data segments to be added to the larger collection. 2. This system could be viewed as a very specialized “wiki” linked to other less specialized wikis.)

This relatively inexpensive system (low storage requirements) would allow a person newly interested in genealogy research, anywhere in the world, Church member or not, to find out in a few minutes all the genealogy research done anywhere in the world that is linked to him or her (or a near relative). (This lookup has nothing to do with temple work at this point.) They could then quickly determine what new, non-duplicative genealogy research they might profitably contribute to the whole.

This new system would be powerful for LDS use as well, because long before any questions about submitting temple work came up, all relevant completed genealogies (and related history, stories, photos, etc.) could be explored. This would almost completely avoid the current problem where people, especially new members, may work for years to gather family names for temple work, only to find at the end that 98% have had their work done multiple times before, probably because those names came out of Church systems in the first place. That is quite shocking and discouraging to many neophyte genealogists.

Incidental operational benefits
Strangely enough, implementing the new system, where “finished” data is stored, should eventually cut down drastically on the high volume of FamilySearch hits and other expected future open-ended searches of a similar nature. Besides its social and missionary contributions, it should more than pay for itself by perhaps cutting in half the number of servers the Church would otherwise need to deploy to support its genealogy systems.

Having the new system in place should offer a huge incentive for people to participate in the Online Indexing (microfilm data transcription) process related to the Granite Mountain Vault project, especially if people are able to choose the microfilm they wish to transcribe, film that has a high likelihood of relating to their family. Having a reliable central place to store some of the results of their transcription, especially the parts relating to their family name identification and lineage-linking, should give them a real sense of accomplishment, a job done well, once and for all.

In fact, if the database system I propose were done early, and the vault images were gradually put online (or copies of those same images offered by other online data suppliers were used), it would be theoretically possible to skip much of the (centrally driven) Online Indexing process as now envisioned for those vault images. People using the online images (or even the traditional microfilm copies), and linking them to their family members in the new database, using actual record keys or library-style references, would in the process be gradually creating an index to those images as a byproduct of their efforts. If desired, at some point a final visual comparison of the images and the partially completed index could finish off the indexing/transcription process for the most often-used segments of the image database.

(The links of names to source images could be turned around, inverted. This view would begin with source images and show the names linked to from each image. The results could be completed and corrected, if necessary, much like the current plan for Online Indexing.)
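The inversion described in that last parenthetical is straightforward to sketch. Below is a minimal illustration (the person identifiers and film/frame references are hypothetical, not real GMV citations): researchers attach source references to people as they work, and flipping those links yields the beginnings of an image index.

```python
# Sketch of the "index as a byproduct" idea: person -> image links,
# entered by researchers, inverted into an image -> persons index.
# All identifiers and film references below are hypothetical.
from collections import defaultdict

# Forward links entered by researchers: person -> list of source refs
person_sources = {
    "huff.engelbert.1630": ["GMV film 183745, frame 12"],
    "huff.john.1662":      ["GMV film 183745, frame 12",
                            "GMV film 200311, frame 87"],
}

def invert_links(forward):
    """Turn person -> image links into an image -> persons index."""
    index = defaultdict(list)
    for person, refs in forward.items():
        for ref in refs:
            index[ref].append(person)
    return dict(index)

image_index = invert_links(person_sources)
```

Any names entered this way would appear in the index immediately; the final visual-comparison pass the post describes would then only need to fill gaps and correct errors.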


Kent Huff

----------------------------------------
801-798-8441, cell 801-787-3729
1748 West 900 South
Spanish Fork, Utah 84660
huffkw@juno.com, http://www.GenReg.com
computer consultant, attorney, author
================================================

User avatar
huffkw
Member
Posts: 54
Joined: Sun Jan 21, 2007 6:34 pm
Location: Spanish Fork, Utah
Contact:

Less-abstract genealogy idea

Postby huffkw » Fri Jan 26, 2007 12:51 pm

Less-abstract genealogy idea

I see that several have clicked on my post about SuperPAF and HyperPAF, but no one has ventured a comment so far. I will assume for the moment that what I said was too abstract, or otherwise missed the mark, and try a different approach.

I will try to put the idea a little more concretely by expressing it in terms of changes to what I think is already on the drawing boards.
------------------------------------------
My suggestions for the next 4 upgrades to planned genealogy systems, as I understand those systems.

1. Add an online (originally empty) PAF-like database structure, separate from the New Family Search database(s), plus appropriate update and report screens or pages.

--In general, this feature should avoid many of the difficulties and limitations of multiple kinds of updates made against the New Family Search (NFS) databases(s) in place. For example, a user could choose one version of a name from NFS, move a copy of it to the PAF-like online database, and forget about any other duplicates that might exist. S(he) would be making no changes to the main databases. Any enhancements made to that name – relationship links, source document links, more complete data, etc., would be made in this separate, personal online space, for which only that person is responsible.

--It appears that updates to the NFS multiple databases may come from many different, uncoordinated sources, some inside the Church and some outside, many potentially changing the structure of the data itself, changing the number of occurrences of certain names, etc. The suggested change would leave only Church-initiated updates to the NFS data, and should greatly lower the complexity of the update programming logic and the chances for strange errors to occur. Avoiding data corruption in the big databases seems like a potential worry that would be removed by the suggested separation between source and target databases. Otherwise, the Church system may have to keep track of many versions of the database(s), instead of just letting the users do all that in their new private space. The plan, as I have heard it, essentially has source and target databases overlying each other. That seems unsustainable in the long run.

--Users can request private storage space, with password protection. Let a user put any genealogy-related data (s)he wants there.

--Anyone in the world (including “future members”) could use this space to store and update their genealogy data for free. This service would invite the world to learn something about the Church and its web offerings. These people could find this database useful even if they have no interest at all in our temple doctrines or practices, and even if they can expect to find no data of interest to them in our existing databases.

--This private space could replace some or all of an individual’s usage of PC-based PAF-like programs.

--Data might be kept in the private space indefinitely, or compilations of data assembled there might be downloaded to a PC-based database.

--Complex Access Control specifications would not need to be applied to the private storage space, only to the NFS databases.


2. Add the ability to enter live links from private space records to specific NFS or Granite Mountain Vault (GMV) source record images.

3. Add features that would allow multiple family members to update the private space simultaneously.

4. Add features that would give the ability to anyone in the world to read data for deceased individuals in any of the private areas marked as “public access allowed.”

There are several more layers of upgrades I might suggest beyond these four, but these are plenty for the moment.

User avatar
HaleDN
Church Employee
Posts: 44
Joined: Mon Jan 22, 2007 2:08 pm
Location: Herriman, Utah, USA
Contact:

Postby HaleDN » Fri Jan 26, 2007 3:26 pm

Since you are looking for some feedback to your ideas, I'll try to respond to get the ball rolling...

huffkw wrote:Less-abstract genealogy idea
1. Add an online (originally empty) PAF-like database structure, separate from the New Family Search database(s), plus appropriate update and report screens or pages.


The value that I see the new FamilySearch offering is to jump start your pedigree so that you don't have to start with an empty database structure. Even though this data probably needs to be tweaked to be sure that it has the correct information and duplicates are handled correctly, this seems like a step in the right direction for many inexperienced genealogists who don't want to re-enter genealogical data which they know is already available.

huffkw wrote:--Users can request private storage space, with password protection. Let a user put any genealogy-related data (s)he wants there.


There are already online services that do this very well (e.g., RootsWeb), but there the GEDCOM files are completely independent of each other. Is your suggestion different from this?

huffkw wrote:2. Add the ability to enter live links from private space records to specific NFS or Granite Mountain Vault (GMV) source record images.


Yes, to make the system extensible, it needs to be able to point to other sources of data, including multimedia files, online documents, or references on other sites. But in doing so, this opens the issues of how to manage these external sources when they become unavailable either temporarily or permanently.

huffkw wrote:3. Add features that would allow multiple family members to update the private space simultaneously.
4. Add features that would give the ability to anyone in the world to read data for deceased individuals in any of the private areas marked as “public access allowed.”


The new FamilySearch will do this, if I recall the LDS Tech Talk presentation correctly. A few other current open source apps also do this (e.g., http://www.phpgedview.net/)

User avatar
thedqs
Community Moderators
Posts: 1038
Joined: Wed Jan 24, 2007 8:53 am
Location: Redmond, WA
Contact:

Postby thedqs » Fri Jan 26, 2007 4:08 pm

I believe what you are saying is that a bunch of separate databases should be created for each individual user, and that users can import information from the New FamilySearch (NFS) into their own databases. They can also choose to upload their info to NFS.

Well, first off, everyone can do that already (except uploading to FamilySearch) with 3rd-party software. The only thing you have changed is that the Church would host everyone's personal database instead of having the database on their computer. Whatever is public and searchable they would just have to upload under the new system, while all their information that is private stays with them.

I might have misunderstood, and if so I'd like some clarification, thanks.
- David

JamesAnderson
Senior Member
Posts: 748
Joined: Tue Jan 23, 2007 2:03 pm

Postby JamesAnderson » Fri Jan 26, 2007 4:26 pm

I see two things already that I had heard something about.

1. I had heard somewhere that they were going to have a way to point to an image that you find an ancestor in from the indexing that is going on now. So if you find an ancestor, you can find the image, and create a link in your tree to that image so that others can see what you have. Never heard anything more about that.

2. One of the reasons for on-site storage vs. home PC is that in some parts of the world, Internet cafes and similar sites are where people go to connect to the net. I'm not sure if thumb drives will be the solution for carrying around large amounts of genealogical data, but I hear that is the way many do it: they use the PC's USB port, work with their things, then take the drive with the updated material with them. It could work that way, but how would such a scenario play out with genealogical data? Right now most of these USB thumb drives only have around 5 megs, although I could be wrong about larger ones.

User avatar
thedqs
Community Moderators
Posts: 1038
Joined: Wed Jan 24, 2007 8:53 am
Location: Redmond, WA
Contact:

Postby thedqs » Fri Jan 26, 2007 4:40 pm

JamesAnderson wrote:I see two things already that I had heard something about.

1. I had heard somewhere that they were going to have a way to point to an image that you find an ancestor in from the indexing that is going on now. So if you find an ancestor, you can find the image, and create a link in your tree to that image so that others can see what you have. Never heard anything more about that.


Understandable. Luckily we have some others that have posted their notes, but in summary there will be a simple PAF-like program which users can run and that will interface with the NFS. This program will most likely be based on Java so it can go cross-platform (maybe even have an applet form for those that can only use the internet). Although I think you have to place the picture on the internet before you can link to it. I might be wrong though.

JamesAnderson wrote:2. One of the reasons for on-site storage vs. home PC is that in some parts of the world, Internet cafes and similar sites are where people go to connect to the net. I'm not sure if thumb drives will be the solution for carrying around large amounts of genealogical data, but I hear that is the way many do it: they use the PC's USB port, work with their things, then take the drive with the updated material with them. It could work that way, but how would such a scenario play out with genealogical data? Right now most of these USB thumb drives only have around 5 megs, although I could be wrong about larger ones.



The only problem is storage. The Church is already using petabytes of storage for FS, and if everyone else stored a separate database that copies the FS system, I can imagine that we couldn't get enough storage. Small USB drives (64 MB) are not that costly now in the USA (I don't know about other countries, but I suppose the prices for a small-capacity USB drive are about the same as a floppy disk) and can easily hold the information needed. My 8-full-generation and 23-extended-generation database fits under 20 MB with some images. The only problem is when you get a lot of media.
- David

User avatar
huffkw
Member
Posts: 54
Joined: Sun Jan 21, 2007 6:34 pm
Location: Spanish Fork, Utah
Contact:

Replies to multiple comments

Postby huffkw » Fri Jan 26, 2007 6:38 pm

Thanks for jumping in, everybody. You already know I have an idea of what I would like to see the Church genealogy systems evolve into.
But getting your reactions, and trying to respond to them, can tell me if I am all wet, or if it really could be done as I hope, and whether other people would agree and value it.

I will jump around a bit:
-----------------------------------------------
thedqs: “the only problem is storage”

As I estimate it, the Church only has about 40 to 50 million unique names in all its active genealogy and temple ordinance files at this point. All the rest are duplicates of one kind or another, even if that totals 1 billion records. (At this point, it would be hard to get the real number of unique names).

Allowing 2,000 characters of storage for each of 50 million names only comes to 100 GB total. Pretty small. All 300 million deceased Americans would only take up about 600 GB, still a small amount. Maybe spend $500 for a 1-terabyte drive.
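As a quick check of that arithmetic, using the 2,000-byte-per-name allowance from the post:

```python
# Back-of-envelope check of the storage figures above.
BYTES_PER_NAME = 2_000
unique_names = 50_000_000      # estimated unique names in Church files
us_deceased = 300_000_000      # all deceased Americans

def gigabytes(n_names):
    """Decimal gigabytes needed to store n_names records."""
    return n_names * BYTES_PER_NAME / 1e9

church_gb = gigabytes(unique_names)   # 100.0 GB
us_gb = gigabytes(us_deceased)        # 600.0 GB
```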

Putting the whole “bit mountain” on disk in image and text format will truly take petabytes, with megabytes per image, but I am assuming that the file I am suggesting will have no more than short URLs or library-style general references to these huge image files, so it will not expand much.

A little extra overhead from some minimal duplication will also not hurt much.
-------------------------------------------
JamesAnderson: “Internet cafes . . . thumb drives”

I would like to see it be perfectly easy for people all over the world to go to a genealogy center, an institute building, a chapel, or a friend’s home and do this work. No need to carry anything around at all except a logon. No expensive hardware to break, lose, or have stolen. This matters a lot in most Third World countries.
--------------------------------------------
thedqs: “problem . . a lot of media.”

I assume there will not be extra media added to the file I suggest, but one could put photos on a wiki somewhere and store a URL to it in the central database.
----------------------------------
JamesAnderson: “point [a name] to an image”
I hope that is what will become of the whole “bit mountain” project. Most of the project’s value would be lost if that were not true.
---------------------------------
thedqs: “a bunch of separate databases . . . everyone can already do that . . . 3rd party . . . upload public data.”

My hope is that by doing it the way I suggest, the central database will contain almost no duplicates, making it a real treasure. As it is, we have on our PCs hundreds and thousands of copies of the same names. Putting them up for public use in that form is only helpful to the most patient among us who will sift through it all for that little bit of gold dust. (I could suggest a new utility to help with avoiding duplication, but it could be done efficiently enough manually once there is a decision to put it on the central database.)
--------------------------------------
haledn: “(Originally empty) . . . .jump start your pedigree”

My assumption is that some group like the Hale family could add their large GEDCOM to the central file, and in the process, any names that were among the descendants of the original Hale ancestor would be numbered in a way to keep them differentiated from the rest, and the deceased among that number could be made publicly viewable.

“already online services that offer this service”

It is not just the free space and indexing features I am interested in.
I am looking for a few important new features here that I don’t believe exist elsewhere.
1. The owner/user could update their GEDCOM data in place, not merely reload it periodically, and could add live links from names to other Church records such as images from the vault. References to source records move out of notes and into data elements/rows.
2. The GEDCOM data would have been renumbered to allow for separate identification of the most valuable descendant-form data, all those with the Hale surname, thus avoiding much of the confusion caused by seeing data on hundreds of other offshoot surnames.

“manage these external sources when they become unavailable”

The work of creating a near-duplicate-free central database pays off handsomely in many ways, but one of them is the ability to create a unique number for each name in the central database. Once that is done, any outside database can then use that number to link its additional data, of any type, to the name in the central database – obituaries, photos, stories, source images, etc. This means that there really are no “broken links” if things move around. The central database need not even keep track of these outside databases, except perhaps to list their main domain name so that other programs can do the “distributed database” processing needed to marry up all the data found in various places on one person.

---Thanks again for your thoughts. I hope these comments help.

===============================================

User avatar
thedqs
Community Moderators
Posts: 1038
Joined: Wed Jan 24, 2007 8:53 am
Location: Redmond, WA
Contact:

Postby thedqs » Fri Jan 26, 2007 8:46 pm

Thank you for clarifying a bit. I think this is the general direction of the NFS, though it goes more along the lines of the World Family Tree project, where everyone uploads their pedigree and then can work on it collaboratively (although I believe living people would remain hidden except to their close relatives). But I need a little more explanation of how this would avoid duplicates any better than what the system tries to do now. From what I understand, the new NFS will show you any close matches and you get to decide if each is a duplicate or not, which from what I gather is the same basis as your suggestion.
- David

User avatar
huffkw
Member
Posts: 54
Joined: Sun Jan 21, 2007 6:34 pm
Location: Spanish Fork, Utah
Contact:

A new twist for dropping dups

Postby huffkw » Fri Jan 26, 2007 11:02 pm

Thanks for hanging in there.

I hate to be a negative spoilsport, and maybe we will have to wait until this summer to see what it is really all about, but I think people are expecting too many new features from what is planned for NFS. I hope I am wrong.

My mental picture so far is that nothing conceptually will change except that several more databases can be searched in parallel (yielding more hits), and the cycle of adding new ordinances to the IGI or membership records will be accelerated, so that new temple work or living member ordinances will be recorded every week instead of every 2 or 3 years as it was in the days of the Ancestral File. The more prompt updating is good, and this will help avoid some ordinance duplication for a few really intense genealogists that check names every week, but will not be a big factor for the many genealogists who only use the Temple Ready check as the last step after a large amount of research effort. We can assume, almost by definition, that if a name is in the NFS, their temple work has been done, so we are not likely to be covering much new ground there. There might be a lot of new ground in the “bit mountain” project.

I was told the IGI part of the NFS is 80% made up of single names receiving single ordinances, as it has always been. There may be huge numbers of duplicate records (for example, 200) for some ancestors. There will be a few records with two names linked, as in marriages, or three names linked, as in sealings of children to parents, as there have always been. In other words, there is not much pedigree data here. Too bad the original family group sheets, from which some of this data came and which linked three generations, were not kept intact and linked together; but that is where we are, with only fragments in the IGI. The Pedigree Resource File, with its many GEDCOMs, will be included in the search, and many duplicates will be found there, as always. In other words, very little will change from the current situation as far as solving the duplication problem.

The new twist
So here is where I would introduce a slightly radical idea to knock out most duplication in one swoop and hopefully end it once and for all in the new database I am suggesting. Here is a simplified view: when a GEDCOM goes in, it is run through a utility program (I have tested the code) and only the largest single-surname descendant structure in that whole GEDCOM is kept. The rest disappears for public purposes. If I have 100,000 names, but only 3,000 are in a single-surname descent structure (like the Huffs starting in 1630 with Engelbert, or the Hales or the Leavitts), only the 3,000 is kept. Whole GEDCOMs will be dropped from public view if no descendant group reaches 1,000 in size.

The idea is that the Huffs might be expected to do a good job on the Huff-surname descendants of a Huff, but shouldn’t be trusted or assigned to do any other surnames. If the Huffs married into the Thomases, let the Thomases do the Thomases, and no one else. Instantly, you have no duplicates unless there happen to be two large single-surname descendant structures with the same surname that made it through the utility. That is unlikely, but if it happens, then just keep the largest and drop the other. No dups.

Then the data owners can continue to add to that one chosen structure, hopefully teaming up with all others interested in that descendant structure. The non-dup database will need links from wives in one structure to their place as a daughter in their maiden-name single-surname descendant structure. Then the job of putting links between structures is finished.
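As a rough illustration of what such a utility might do (this is my own sketch of the idea, not the author's tested code; the record layout and the way the 1,000-name threshold is applied are assumptions):

```python
# Hedged sketch of the GEDCOM-filtering utility described above: keep only
# the largest single-surname descendant structure, dropping the rest from
# public view. The data layout (id -> (surname, father_id)) is hypothetical.
from collections import defaultdict

def largest_surname_structure(people, min_size=1000):
    """people: dict id -> (surname, father_id or None).
    Returns the set of ids in the largest same-surname descent group,
    or an empty set if no group reaches min_size."""
    children = defaultdict(list)
    for pid, (_, father) in people.items():
        if father is not None:
            children[father].append(pid)

    def same_surname_descendants(root):
        surname = people[root][0]
        group, stack = set(), [root]
        while stack:
            pid = stack.pop()
            if people[pid][0] != surname or pid in group:
                continue
            group.add(pid)
            stack.extend(children[pid])
        return group

    best = set()
    for pid, (surname, father) in people.items():
        # A structure starts where the surname line begins.
        if father is None or people.get(father, ("",))[0] != surname:
            group = same_surname_descendants(pid)
            if len(group) > len(best):
                best = group
    return best if len(best) >= min_size else set()
```

Running something like this over an uploaded GEDCOM would keep only the dominant surname line, as the post describes; the family fragments it skips over could be re-attached later.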

Is anyone ready to do something radical to kill dups worldwide? I think many are.

I am guessing the final result of processing all the GEDCOMs from every source should yield something in the range of 20-40 million non-duplicated names, but there is no way to know until we do it. Then the fragments of families skipped over by the utility can be found and added in, with new features aiding that process.

User avatar
huffkw
Member
Posts: 54
Joined: Sun Jan 21, 2007 6:34 pm
Location: Spanish Fork, Utah
Contact:

Expanding on some points

Postby huffkw » Sat Jan 27, 2007 10:02 pm

Expanding on some points
After some thought, I would like to expand on a few points brought up earlier.
-------------------------------------------
thedqs: “the only problem is storage”

Actually, I fear we could be missing a great promotional opportunity here.

Note that if each user were to put in 5,000 names and each took 2,000 bytes of space, or 10 megabytes in all, that would cost only about one cent apiece for storage space. (At $1 per gigabyte, 10 MB costs about one penny).

Perhaps we should be giving away free genealogy workspace like we give away free copies of the Bible and Book of Mormon. At a penny a person for genealogy data storage space, that is far cheaper than giving away books. Yahoo and Google give away much more (2GB+) for free (to get people to use their site and view their ads), so we ought to be able to do something similar. Some nice ads about Book of Mormon topics could get them reading the book online. And who is going to write “The Great (Early) American Novel” on a Book of Mormon topic, or write the blockbuster game on Nephite battle tactics? If Mel Gibson can make money on a Mayan movie theme, why can’t we do something like that? We could advertise many such things on this site (assuming someone does the creative work).

Some calculations: if 10 million people used the free site, that would be a one-time cost of $100,000 for storage. Spread over a 5-year hardware life, that is $20,000 a year, which equals about 4 full-time missionaries for a year. Seems like a good cost/benefit trade for drawing 10 million people into closer contact with the Church. Eventually supporting a staggering (upper limit of) 50 million engaged users worldwide would cost about $100,000 a year, the same as 20 missionaries, but the positive public-relations impact would be far more than 5 times larger than with the 10 million participants.
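A quick check of that cost arithmetic, assuming the $1-per-gigabyte figure from earlier in the post:

```python
# Checking the promotional-cost arithmetic above.
BYTES_PER_NAME = 2_000
NAMES_PER_USER = 5_000
DOLLARS_PER_GB = 1.0

per_user_bytes = NAMES_PER_USER * BYTES_PER_NAME          # 10 MB per user
per_user_cost = per_user_bytes / 1e9 * DOLLARS_PER_GB     # about one cent

users = 10_000_000
one_time_cost = users * per_user_cost                     # about $100,000
per_year = one_time_cost / 5                              # about $20,000/yr over a 5-year life
```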
--------------------------------------
Database value
What is the social value to the world of one 40 million-name, dynamic, integrated, genealogy database free of duplicates, and designed to grow endlessly in size, richness, and provable accuracy, while always staying free of duplicates? That value seems pretty high to me. The world would really clamor to use it, and would want to add its data to such a treasure.

It would look like one big integrated database to the outsider, even though it is actually cooperatively maintained in many smaller pieces inside. (Assume only deceased names can be seen without special permission from data owners.)

There is a large chance that many people would get on board and gradually grow the database to include 600 million Americans, living and dead, and 600 million Europeans, living and dead. Then the Church would have created a new wonder of the world. (And these millions of non-Church participants would be doing much of the Church members’ genealogy research for them for free, since we have many of the same relatives.)
If someone wants to present a name or a family to the world, inviting anyone who has data on that person or family to add it (in one of several ways) to what is known already, this is the place to put it.

In pure monetary terms that sophisticated database would be worth many billions of dollars, so we can be sure it would be noticed and used. The Church might even figure out a way to recoup its operating costs.
--------------------------------------
haledn: “how to manage external sources when they become unavailable”

Three kinds of links needed
The central database needs to support three kinds of links.
1. Library references that are not hotlinks, but still tell you where the documents can be found, and are in standardized electronic database form so they can be examined for such things as percentage of a public record set that has been used/referenced, etc.
2. Hotlinks to Church source records. Those records should not move around much, so there is little likelihood the links will become broken.
3. The unique person number assigned within the database will allow anyone, anywhere to tie data to that person, with no updates required to the central database.
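One way the first two link kinds might be represented in the central database (field names and values here are purely illustrative, not a real schema); the third kind needs no record at all on the central side, since outside databases store the person number themselves:

```python
# Illustrative model of link kinds 1 and 2. Kind 3 is the inverse case:
# external sites keep the central person number, so nothing is stored here.
from dataclasses import dataclass

@dataclass
class SourceLink:
    person_number: int   # unique number assigned in the central database
    kind: str            # "library_ref" (kind 1) or "church_hotlink" (kind 2)
    target: str          # standardized call number, or a stable Church URL

links = [
    SourceLink(104233, "library_ref", "FHL film 183745, item 2"),
    SourceLink(104233, "church_hotlink", "https://example.org/images/183745/12"),
]
```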
-------------------------------------
A simple client-side program to integrate all data on a person
If there are multiple sites on the web that contain data about one person (obituaries, photos, stories, videos, source images, etc.), a fairly simple client-side program (to begin with) could take the current list of such registered add-on data sites and pull out and assemble whatever they have to offer. These peer-to-peer operations would keep most of this kind of high volume data assembly activity out of the central site.
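A toy version of that client-side assembler, with simple stub functions standing in for network calls to registered add-on sites (the site list, person number, and data are all hypothetical):

```python
# Sketch of the client-side assembler: given a person's unique number,
# query each registered add-on site and merge whatever it has to offer.
# The fetchers below are stand-ins for real peer-to-peer network calls.

def obituary_site(person_number):
    data = {104233: ["Obituary, Deseret News, 1912"]}
    return data.get(person_number, [])

def photo_site(person_number):
    data = {104233: ["portrait_1890.jpg"]}
    return data.get(person_number, [])

REGISTERED_SITES = [obituary_site, photo_site]

def assemble(person_number):
    """Pull together everything the registered sites have on one person."""
    found = []
    for fetch in REGISTERED_SITES:
        found.extend(fetch(person_number))
    return found
```

Because each site is keyed by the same unique person number, the central database never has to be updated when an outside site adds material.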

Note that for some simple HTML web sites, by using the unique person number, the indexing to that site could be done by Google.
--------------------------------------
Private firm involvement
I just realized that I have not suggested how private firms might get involved in processes related to this new database. I will have to write up something on that.

