Just a bit of info that some users may find useful regarding duplicates.
I have always done it the hard way, using loops and variables, but the BD web site has an example file, "Duplicates", which appears to work OK. However, I wanted to restore the unique records back to the main folder. I thought a Move action at the end of the query would do this, but whatever I tried produced error warnings. I finally found that the following actions worked when attached to a toolbar button:
Execute Query (Get Uniq Records)
Select Folder (All Recs)
Select All Records (Uniq)
Move Record(s) (All Recs, Folder "Uniq" content)
Hope this may be of use.
Had to add an if statement to the query "end script" to stop a warning when there are no duplicates:
If queRecN(|Get Uniq Records|) > "0" then
Move Record(s) (Recycle Bin, Folder "All Recs" content)
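The sequence above boils down to: split the main folder into unique records and leftover duplicates, send the duplicates to the recycle bin, and move the uniques back. Here is a rough Python sketch of that logic, modeling folders as plain lists of record dicts. The `id` key field and the list model are illustrative only, not how Brilliant Database stores records internally:

```python
# Folders modeled as lists of record dicts; "id" is an illustrative key field.
all_recs = [
    {"id": 1}, {"id": 2}, {"id": 1}, {"id": 3}, {"id": 2},
]
uniq, recycle_bin, seen = [], [], set()

# "Execute Query (Get Uniq Records)": the first copy of each id goes to the
# Uniq folder; the leftover duplicates are destined for the Recycle Bin.
for rec in all_recs:
    (uniq if rec["id"] not in seen else recycle_bin).append(rec)
    seen.add(rec["id"])

# Guard mirroring queRecN(|Get Uniq Records|) > 0: only replace the main
# folder's contents when the query actually matched something.
if len(uniq) > 0:
    all_recs = uniq   # "Move Record(s) (All Recs, Folder 'Uniq' content)"
```

With the five sample records above, three unique records end up back in the main folder and two duplicates land in the recycle bin.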
Last edited by tamcind; 12-13-14 at 06:20.
Reason: More info
About four months ago, I regrettably discovered that I had received some 'tainted' information in my project. The person at the hospital who was tasked with generating the reports I needed had inadvertently included some of the records twice. At the time I made this discovery, my project had about 108K records (a bit above the 100K maximum stated by Brilliant Database). Approximately 2,500 records were duplicates, and I had no idea how to systematically and efficiently delete the duplicates in Brilliant Database... so I exported the records to Excel, sorted them down to only the unique records, deleted all the original records in Brilliant Database, and imported the now-unique records back into Brilliant Database. This process, as you can easily imagine, took a considerable amount of time and effort (plus Brilliant Database does not seem to like the deletion of a large volume of records!).
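For anyone taking the same export route, the spreadsheet sort-and-dedup step can also be done in a few lines of Python with the standard `csv` module. The column names below are made up for illustration; the comparison treats two rows as duplicates only when every column matches, which is what the whole-row sort in a spreadsheet effectively does:

```python
import csv
import io

# An illustrative export with one fully duplicated row (column names invented).
exported = """patient_id,visit_date
101,2014-01-05
102,2014-01-06
101,2014-01-05
"""

reader = csv.reader(io.StringIO(exported))
header = next(reader)

seen, unique_rows = set(), []
for row in reader:
    key = tuple(row)          # whole-row comparison: every column must match
    if key not in seen:
        seen.add(key)
        unique_rows.append(row)

# unique_rows now holds only the distinct records, ready to re-import.
```

Reading a real export would just mean replacing the `io.StringIO(exported)` wrapper with `open("export.csv", newline="")`.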
You get a star for this idea! Your method certainly sounds much, much simpler. As a favor, could I ask that you post this EXCELLENT suggestion in the 'Best Practices' sticky thread?
Yes, I may move it to the tips folder, but I would like to experiment a little more. Although it seems to work OK, the thing that worries me is that moving large groups of records between folders may fragment the database. I am testing a similar idea: using the "unique" option with no search parameters in the query editor, but marking the wanted records with a checkbox (true/false) in the query actions for each record, then using another query to move the unmarked records to the recycle bin. It seems to work OK, but I will check it on a large set (100,000?) — it was fast on 1,000 or so.
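The mark-then-sweep variant described above avoids moving records between folders during the first pass: the first query only flips a checkbox on the records to keep, and a second query sweeps the unmarked ones into the recycle bin. A minimal Python sketch of that two-pass idea, again with an illustrative `id` field and a `keep` flag standing in for the checkbox:

```python
# Records in place in one folder; "keep" stands in for the true/false checkbox.
records = [
    {"id": 1, "keep": False},
    {"id": 2, "keep": False},
    {"id": 1, "keep": False},   # duplicate of record 1
]

# Pass 1 ("unique" query with per-record actions): tick the checkbox on the
# first occurrence of each id, leaving duplicates unmarked.
seen = set()
for rec in records:
    if rec["id"] not in seen:
        seen.add(rec["id"])
        rec["keep"] = True

# Pass 2 (second query): move the unmarked records to the recycle bin.
recycle_bin = [r for r in records if not r["keep"]]
records = [r for r in records if r["keep"]]
```

Because pass 1 only writes a flag, the bulk record moves are deferred to a single sweep at the end, which is plausibly why it felt fast on the small test set.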
The point is that it pays to "experiment" with BD, just in case there is an unforeseen "side effect", as many members have found previously.
Seasons greetings to everyone