I have a Laravel 5.8 (also tested with Laravel 6.1.0) job that periodically downloads a file from the NOAA Space Weather Prediction Center. As part of the job, I want to validate some of the attributes before inserting them into the database, and fail the job if things aren't right.
The SWPC file is a JSON file that contains an array of objects that looks like this:
[{
    "time_tag": "2019-10-02T13:39:00",
    "bt": 4.45,
    "bz_gsm": 0.09,
    // (there are lots of other, irrelevant fields here)
}, {
    // (more of the same data here, repeated dozens of times)
}]
The main part of my job looks like this:
$request = $guzzle->request('GET', 'https://services.swpc.noaa.gov/json/rtsw/rtsw_mag_1m.json');
$imf = collect(json_decode($request->getBody()->__toString()));
Validator::make($imf->all(), [
'*.time_tag' => 'required|date', // Must be a valid date
'*.bt' => 'required', // Bt must be present
'*.bz_gsm' => 'required', // Bz must be present
])->validate();
dd($imf->first());
According to the documentation, calling validate() on a manually created Validator should act the same as if you validated a request object (that is: "If validation fails, the user will automatically be redirected or, in the case of an AJAX request, a JSON response will be returned").
However, with the above code, validation passes, but the time_tag, bt, and bz_gsm fields are null. If I remove the time_tag validation rule, time_tag is no longer null (but bt and bz_gsm still are). The same goes for the bt and bz_gsm rules. Basically, any field that gets validated is nulled.
If I change my validation code to this:
$validator = Validator::make($imf->all(), [
'*.time_tag' => 'required|date',
'*.bt' => 'required|file', // Bt is not an uploaded file! Validation should fail.
'*.bz_gsm' => 'required',
]);
dd($validator->fails());
Then fails is false (i.e., validation passes). If I dd($imf->first()), then once again time_tag, bt, and bz_gsm are null, while every other element in $imf retains its original value.
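Since this all runs inside a queued job, there is no user to redirect, so my expectation (an assumption on my part, based on the documented behaviour above) was that a failing validate() would throw Illuminate\Validation\ValidationException, which I could catch to fail the job explicitly, roughly like this sketch:
use Illuminate\Validation\ValidationException;

try {
    Validator::make($imf->all(), [
        '*.time_tag' => 'required|date',
        '*.bt' => 'required',
        '*.bz_gsm' => 'required',
    ])->validate();
} catch (ValidationException $e) {
    // Assumption: in a job there is no request to redirect, so the
    // exception surfaces here; fail the job with the underlying error.
    $this->fail($e);

    return;
}
With the data above, though, that exception is never thrown, because validation reports success.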
As far as I'm aware, my rule keys match the data, because dd($imf->all()) before validation gives a lot of output that looks like this:
array:2031 [
0 => {#922
+"time_tag": "2019-10-02T23:55:00"
+"active": true
+"source": "ACE"
+"range": null
+"scale": null
+"sensitivity": null
+"manual_mode": false
+"sample_size": 60
+"bt": 3.48
+"bx_gse": -3.13
+"by_gse": 0.58
+"bz_gse": 1.4
+"theta_gse": 23.73
+"phi_gse": 169.57
+"bx_gsm": -3.13
+"by_gsm": 0.9
+"bz_gsm": 1.23
+"theta_gsm": 20.65
+"phi_gsm": 164.0
+"max_telemetry_flag": 0
+"max_data_flag": 0
+"overall_quality": 0
}
// ... a few thousand more lines of output here
while dd($imf->all()) after validation gives me:
array:2032 [
0 => {#922
+"time_tag": null
+"active": true
+"source": "ACE"
+"range": null
+"scale": null
+"sensitivity": null
+"manual_mode": false
+"sample_size": 60
+"bt": null
+"bx_gse": -3.16
+"by_gse": 0.94
+"bz_gse": 1.2
+"theta_gse": 20.01
+"phi_gse": 163.43
+"bx_gsm": -3.15
+"by_gsm": 1.2
+"bz_gsm": null
+"theta_gsm": 15.69
+"phi_gsm": 159.09
+"max_telemetry_flag": 0
+"max_data_flag": 0
+"overall_quality": 0
}
// ... a few thousand more lines of output here
So the validator is finding the keys it needs and is overwriting just those, while leaving the rest of the data intact.
Is there some hidden trick to validating arrays of data, or validating data in a Laravel job?
You can see in your dd() output that what you're passing to the validator is an array of stdClass objects (which is what json_decode() produces by default when it sees {}). However, the '*.blah' wildcard notation for the validator is actually for validating an array of arrays. As you have discovered, you get strange behavior when passing it objects instead of arrays.
The simple fix is to pass a second argument of true to json_decode(), which will convert the objects to associative arrays so that the Validator can parse them properly:
$imf = collect(json_decode($request->getBody()->__toString(), true));
Note: this also changes the structure of $imf from what you had; any processing you do with it after validation will have to be appropriately adjusted.
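For example (a minimal sketch; how the rest of your job reads these fields is an assumption), each entry is now an associative array instead of an object, so property access becomes array access:
$imf = collect(json_decode($request->getBody()->__toString(), true));

// Before the change, each entry was a stdClass object:
// $bt = $imf->first()->bt;

// After the change, each entry is an associative array:
$bt = $imf->first()['bt'];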