I have not had consistent results when deleting entities in an Azure table when the deletion occurs in a “foreach” loop. PartitionKey is unique.
Is there a best practice that folks can recommend?
I have left out Try..Catch statements for brevity.
Assuming:
var context = new FooContext(_storageAccount);
var query = (from e in context.TableBar
where e.RowKey == rowKey
select e).AsTableServiceQuery();
Method #1
foreach (var entity in query.Execute())
{
// delete each entity
context.DeleteObject(entity);
context.SaveChanges();
}
Method #2
foreach (var entity in query.Execute())
{
// delete each entity
context.DeleteObject(entity);
}
context.SaveChanges();
Method #3
var bars = query.Execute();
foreach (var bar in bars)
context.DeleteObject(bar);
context.SaveChanges();
Method #1 appears to delete most entities, but the last entity is generally not deleted, even though it is a valid entity.
If it is important for you to keep using the table, the only other option is to delete the entities individually. For faster deletes, you can look at deleting entities using Entity Batch Transactions, but note that you need to fetch the entities before you can delete them.
The row key is a unique identifier for an entity within a given partition. Together the PartitionKey and RowKey uniquely identify every entity within a table.
You can also do this with Azure Storage Explorer: export the table of choice, delete the property in the CSV file, and import it into a new table. Then drop the existing table and rename the new table to the original name.
If you already know the PartitionKey and RowKey of an entity you don't have to load it first. The fastest way is to use the CloudTable directly and a DynamicTableEntity like this:
var cloudTable = cloudTableClient.GetTableReference("TheTable");
// ETag = "*" performs an unconditional delete (no concurrency check)
var entity = new DynamicTableEntity(partitionKey, rowKey) { ETag = "*" };
cloudTable.Execute(TableOperation.Delete(entity));
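If the entity may already be gone, the delete call throws. A hedged sketch of an idempotent variant, assuming the same cloudTable reference and the classic WindowsAzure.Storage client, where a missing entity surfaces as an HTTP 404 inside a StorageException:

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// Sketch: delete that tolerates an already-deleted entity.
// `cloudTable`, `partitionKey`, and `rowKey` come from the snippet above.
var entity = new DynamicTableEntity(partitionKey, rowKey) { ETag = "*" };
try
{
    cloudTable.Execute(TableOperation.Delete(entity));
}
catch (StorageException ex) when (ex.RequestInformation.HttpStatusCode == 404)
{
    // Entity was not found; treat it as already deleted.
}
```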
If you want to delete a collection of entities whose RowKeys you don't know, you will need to load the entities first and use a batch operation to delete them. You don't need to load all properties of the entities, only the RowKey, so you can use the same technique as above with a projection query:
var batchOperation = new TableBatchOperation();
// We need to pass at least one property to project; otherwise
// all properties will be fetched by the operation
var projectionQuery = new TableQuery<DynamicTableEntity>()
.Where(TableQuery.GenerateFilterCondition("PartitionKey",
QueryComparisons.Equal, "ThePartitionKey"))
.Select(new string[] { "RowKey" });
foreach (var e in table.ExecuteQuery(projectionQuery))
batchOperation.Delete(e);
table.ExecuteBatch(batchOperation);
A word of caution: a batch operation allows a maximum of 100 entities, and all entities in a batch must share the same PartitionKey, so you may need to split the entities into properly sized batches for this to work.
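The caveat above can be handled by splitting the projected entities into groups of at most 100 before executing each batch. A minimal sketch, assuming the same table reference and single-partition query from the answer; the Chunk helper is illustrative and not part of the SDK:

```csharp
using System.Collections.Generic;

// Hypothetical helper: split a sequence into lists of at most `size` items
// (Azure table batch operations are limited to 100 entities each).
static IEnumerable<List<T>> Chunk<T>(IEnumerable<T> source, int size)
{
    var bucket = new List<T>(size);
    foreach (var item in source)
    {
        bucket.Add(item);
        if (bucket.Count == size)
        {
            yield return bucket;
            bucket = new List<T>(size);
        }
    }
    if (bucket.Count > 0)
        yield return bucket;
}

// Usage with the projection query above (one partition, so only the
// 100-entity limit applies):
//
//   foreach (var group in Chunk(table.ExecuteQuery(projectionQuery), 100))
//   {
//       var batch = new TableBatchOperation();
//       foreach (var e in group)
//           batch.Delete(e);
//       table.ExecuteBatch(batch);
//   }
```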