
Timing test on Azure ML

I have created datasets of various sizes, say 1 GB, 2 GB, 3 GB, 4 GB (< 10 GB), and am executing various machine learning models on them in Azure ML.

1) Can I find out the server specifications (RAM, CPU) provided by the Azure ML service?

2) At times the Reader module reports "Memory exhausted" for > 4 GB of data, although according to the documentation Azure ML should be able to handle 10 GB of data.

3) If I run multiple experiments in parallel (in different browser tabs), they take more time.

4) Is there any way to set the RAM and number of CPU cores in Azure ML?

asked Oct 31 '22 08:10 by Naseer

1 Answer

I have a partial answer:

1. No, the hardware is abstracted away, so the server specifications are not exposed.

2. The following types of data can expand into larger datasets during feature normalization and are limited to less than 10 GB: sparse, categorical, strings, and binary data (see this).

3. I'm not sure, but while working with it I didn't notice any difference between running a single experiment and running multiple experiments in parallel.

4. You can scale the machines in the Standard tier (see this).
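To illustrate point 2, here is a minimal sketch (using pandas, not Azure ML itself, and a hypothetical `city` column) of how a categorical string column expands during one-hot encoding, which is one reason a dataset well under 10 GB can still exhaust memory after feature normalization:

```python
import pandas as pd

# Hypothetical dataset: one categorical column with 100 distinct values.
df = pd.DataFrame({"city": [f"city_{i % 100}" for i in range(1000)]})

# One-hot encoding replaces the single column with one column per category,
# so the column count grows from 1 to 100.
encoded = pd.get_dummies(df, columns=["city"])

print(df.shape)       # (1000, 1)
print(encoded.shape)  # (1000, 100)
```

The same effect, scaled up to millions of rows and high-cardinality columns, is what pushes a 4 GB input past the service's working limit.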

answered Nov 15 '22 07:11 by marnun