Posted 1 month ago
Robert Patten, Employee
 640 Points
 630 Points
There are about 38K in total across all tables.
I have 100K unique values in the lookup file.
Regards
 910 Points
https://thedatalobby.kuzodata.com/maximumperformancedatamasking/
However, it's probably worth testing it to see whether it fixes your issue and how it performs.
 630 Points
This cannot be the algorithm going forward; I'm just wondering why such an issue came up with the current sample of values.
The only thing to add is that I have multiple copies of the same DB that use this algorithm when masked.
Regards
 5,846 Points
Hi,
If you are using a lookup file, this means that you are not
using the Segmented Mapping algorithm but the Secure Lookup one instead.
Note that it's more appropriate to use Segmented Mapping for
primary/foreign key columns, since it guarantees unique value
generation.
Can you show an example of the values you have in your column
so that we can help?
Regards,
Mouhssine
 630 Points
I am using the Mapping algorithm, not Segmented Mapping or Secure Lookup.
Regards
 5,846 Points
Hi,
Sorry for the misunderstanding.
So the rule is that the number of unique values in the mapping
file has to be equal to or greater than the number of unique values in the
column to guarantee uniqueness.
Can you please confirm whether this is the case with your mapping file?
Regards,
Mouhssine
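That rule can be verified up front before running the masking job. A minimal sketch, assuming a one-value-per-line mapping file and a plain-text export of the column's distinct values; the file names and helper functions are hypothetical, not Delphix tooling:

```python
# Precondition check for the Mapping algorithm: the mapping file
# must contain at least as many unique values as the column does.

def count_unique(path):
    """Return the number of distinct non-empty lines in a file."""
    with open(path) as f:
        return len({line.strip() for line in f if line.strip()})

def mapping_file_is_sufficient(mapping_path, column_path):
    """True when the mapping file can cover every distinct column value."""
    return count_unique(mapping_path) >= count_unique(column_path)
```

With roughly 100K unique mapping values against ~38K column values, as described in this thread, the check passes, which is what makes the "not enough values" error surprising.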
 5,846 Points
Hi,
Just to complete the answer.
I agree with Math's remarks: the masking job has to load the
100K entries of the mapping file into memory before proceeding with masking, and this
raises performance/maintenance questions.
You will have to update the mapping file whenever needed
to guarantee it has equal or more values than what you have in your columns.
I still think that the easiest and smartest way is to use the Segmented
Mapping algorithm; is there any particular reason for not using it in your case?
Regards,
Mouhssine
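Since the same mapping file is shared by multiple copies of the DB, that maintenance check has to cover the union of distinct values across all of them, not each copy in isolation. A minimal sketch with hypothetical file paths (again, not part of the Delphix engine):

```python
def distinct_union(column_paths):
    """Union of distinct non-empty lines across several column exports."""
    values = set()
    for path in column_paths:
        with open(path) as f:
            values.update(line.strip() for line in f if line.strip())
    return values

def spare_values(mapping_path, column_paths):
    """Mapping-file headroom across all DB copies.

    Returns distinct mapping entries minus the union of distinct column
    values; a negative result means the mapping file must be extended
    before unique output can be guaranteed.
    """
    with open(mapping_path) as f:
        mapping = {line.strip() for line in f if line.strip()}
    return len(mapping) - len(distinct_union(column_paths))
```

Running this as a periodic maintenance step would tell you when the file needs topping up before a job fails mid-run.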
 630 Points
The lookup file provides realistic values, while Segmented Mapping would produce haphazard, random ones; realistic values are what we need in our case.
The issue is not which algorithm is being used; it's simply why the job is asking for more values when it has more than enough, more than double the unique values available.
I am aware of the other provisions and performance issues that exist.
Regards.
 5,846 Points
Hi,
I've got the point. Could you give it a try with the Secure Lookup (SL) algorithm?
You will have to generate a file with a number of values >= the number of entries in the
column.
Maybe the hitch is in how the Mapping algorithm works, and the
idea is to try a simple replacement while still guaranteeing that the
realistic meaning of the values is preserved.
Regards,
Mouhssine
 5,846 Points
Have you defined any ignore characters at the algorithm level? Maybe this influences the results.
https://support.delphix.com/Unpublished_Articles/KBA1328_Mapping_Algorithm_(MA)_Technical_Overview
Regards,
Mouhssine
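Ignore characters are worth checking because they are stripped before comparison, so two values that differ only in ignored characters collapse into one, shrinking the effective pool of unique entries. A minimal illustration of that effect; the normalization below is an assumption about how such a setting could behave, not Delphix code:

```python
def effective_unique(values, ignore_chars=""):
    """Count distinct values after stripping ignored characters,
    mimicking how an ignore-character setting could collapse entries."""
    table = str.maketrans("", "", ignore_chars)
    return len({v.translate(table) for v in values})

values = ["123-45", "12345", "999-99"]
# With no ignore characters, all three values are distinct.
# Ignoring "-" collapses "123-45" and "12345" into one value,
# so the effective unique count drops from 3 to 2.
```

If something similar happened at scale, 100K nominal entries could shrink well below the ~38K the columns need, which would explain a "not enough values" error despite an apparently ample file.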
 630 Points
There are no ignored characters. The Mapping algorithm ticks all the boxes in our case.
It's just this anomaly we are facing: "not enough lookup values" while there are plenty.
Regards