I am trying to design a data model that can hold a very large amount of data. Does anyone with experience handling large volumes of data have any feedback on this? For example:
// Example only, simplified.
import java.util.ArrayList;
import java.util.Date;
import java.util.List;

public class TransactionAccount {
    private long balance; // kept equal to the sum of all transactions
    private List<Transaction> transactions = new ArrayList<Transaction>();
    // ...
    public long getBalance() { return balance; }

    private static class Transaction {
        public Date date;
        public long amount;
    }
}
Based on what I have read, the only way to get transactional integrity when inserting a Transaction and updating balance is to put them in one entity group.

However, over time there would be millions of transactions for a particular TransactionAccount. The number of writes to this entity group would be low, but the reads would be much higher.

I know the balance could possibly be sharded, but reading the balance is a very frequent operation, and sharding it would turn getBalance(), one of the most common operations, into the slowest one.
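For reference, here is roughly how I picture the single-entity-group write, sketched with the low-level Datastore API; the kind and property names are placeholders rather than anything final.

// Minimal sketch with the low-level Datastore API (not the POJOs above);
// kind and property names are illustrative assumptions.
import com.google.appengine.api.datastore.*;
import java.util.Arrays;
import java.util.Date;

public class AccountService {
    private final DatastoreService ds = DatastoreServiceFactory.getDatastoreService();

    // Insert a transaction and update the balance atomically: both entities
    // live in the same entity group (the Transaction is a child of the account).
    public void addTransaction(Key accountKey, long amount) throws EntityNotFoundException {
        Transaction txn = ds.beginTransaction();
        try {
            Entity account = ds.get(txn, accountKey);
            long balance = (Long) account.getProperty("balance");
            account.setProperty("balance", balance + amount);

            Entity tx = new Entity("Transaction", accountKey); // child of the account
            tx.setProperty("date", new Date());
            tx.setProperty("amount", amount);

            ds.put(txn, Arrays.asList(account, tx));
            txn.commit();
        } finally {
            if (txn.isActive()) {
                txn.rollback();
            }
        }
    }
}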
The arrangement you describe should work fine. If your entity group grows excessively big (we're talking hundreds of megabytes of transactions before this becomes an issue), you could write a procedure to 'roll up' old transactions: transactionally replace a set of old transaction records with a single one for the sum of those transactions, in order to maintain the invariant that the balance is equal to the sum of all transactions. If you still need to store a record of these old, 'rolled up' transactions, you can make a copy of them in a separate entity group before you perform the roll-up.
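A rough sketch of such a roll-up with the low-level Datastore API might look like the following; the cutoff, the property names, and the rolledUp marker are just illustrative, and copying the old records to a separate entity group first is omitted. In practice you would also cap how many records you roll up in a single transaction.

// Rough sketch of the roll-up, again with the low-level Datastore API.
import com.google.appengine.api.datastore.*;
import com.google.appengine.api.datastore.Query.FilterOperator;
import com.google.appengine.api.datastore.Query.FilterPredicate;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;

public class RollupService {
    private final DatastoreService ds = DatastoreServiceFactory.getDatastoreService();

    // Replace all transactions older than 'cutoff' with a single summary
    // transaction for their sum, inside one transaction on the entity group,
    // so the invariant balance == sum(transactions) is preserved.
    public void rollUp(Key accountKey, Date cutoff) {
        Transaction txn = ds.beginTransaction();
        try {
            // Ancestor query, so it is allowed inside the transaction.
            Query q = new Query("Transaction", accountKey)
                    .setFilter(new FilterPredicate("date", FilterOperator.LESS_THAN, cutoff));

            long sum = 0;
            List<Key> oldKeys = new ArrayList<Key>();
            for (Entity e : ds.prepare(txn, q).asIterable()) {
                sum += (Long) e.getProperty("amount");
                oldKeys.add(e.getKey());
            }

            Entity summary = new Entity("Transaction", accountKey);
            summary.setProperty("date", cutoff);
            summary.setProperty("amount", sum);
            summary.setProperty("rolledUp", true); // marker so summaries can be told apart later

            ds.delete(txn, oldKeys);
            ds.put(txn, summary);
            txn.commit();
        } finally {
            if (txn.isActive()) {
                txn.rollback();
            }
        }
    }
}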