I have created data sets of various sizes, say 1 GB, 2 GB, 3 GB, and 4 GB (all < 10 GB), and am running various machine learning models on them in Azure ML.

1) What are the server specifications (RAM, CPU) provided by the Azure ML service?

2) At times the Reader fails with "Memory exhaust" for more than 4 GB of data, even though Azure ML should be able to handle 10 GB of data according to the documentation.

3) If I run multiple experiments in parallel (in different browser tabs), they take longer to finish.

4) Is there any way to set the RAM or the number of CPU cores in Azure ML?
I have a partial answer:

1. No; the hardware is abstracted away, so the server specifications are not exposed.
2. The following types of data can expand into larger datasets during feature normalization, and are limited to less than 10 GB (see this; a concrete sketch of the expansion follows below):

- Sparse
- Categorical
- Strings
- Binary data
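Feature normalization of a categorical string column typically one-hot encodes it, turning one column into one column per distinct value. Here is a minimal, Azure-ML-agnostic sketch of that blow-up in Python; the column name `category` and all the sizes are invented for illustration, not taken from Azure ML:

```python
# Minimal sketch (pandas/numpy, not Azure ML-specific) of why categorical
# string data can expand during feature normalization: one-hot encoding
# turns a single column into one indicator column per distinct value.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# One categorical column with 1,000 distinct string values.
df = pd.DataFrame({"category": rng.integers(0, 1000, size=100_000).astype(str)})

encoded = pd.get_dummies(df["category"])  # ~1,000 indicator columns

print(f"original: {df.memory_usage(deep=True).sum() / 1e6:6.1f} MB")
print(f"one-hot:  {encoded.memory_usage(deep=True).sum() / 1e6:6.1f} MB")
```

With these made-up sizes, a few MB of strings becomes roughly 100 MB of indicator columns (100,000 rows × ~1,000 one-byte columns); the same multiplication is presumably what pushes a 4 GB dataset toward the 10 GB ceiling during normalization.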
3. I'm not sure, but while working with it I didn't notice any difference between running a single experiment and running multiple experiments in parallel.
4. You can scale the machines in the Standard tier (see this).
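As a general workaround for the memory limit in point 2 (this is plain pandas, not part of the Azure ML API; the column names and sizes are invented), you can shrink a repetitive string column locally before uploading by converting it to the `category` dtype, which stores one copy of each distinct value plus small integer codes:

```python
# Hedged, Azure-ML-agnostic sketch: shrink a dataset locally before upload
# by dictionary-encoding a repetitive string column as pandas "category".
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "city": rng.choice(["London", "Paris", "Tokyo"], size=1_000_000),
    "value": rng.random(1_000_000),
})

before = df.memory_usage(deep=True).sum()
df["city"] = df["city"].astype("category")  # keep 3 strings + integer codes
after = df.memory_usage(deep=True).sum()

print(f"before: {before / 1e6:.1f} MB, after: {after / 1e6:.1f} MB")
```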