Is there a way to query the COUNT of all the POSTS created YESTERDAY in FQL? I know how to do this in SQL but it's not clear in FQL.
Yes, there is!
FQL as a query language is a lot more like a programming language than SQL, because it has many functions that perform specific tasks, and it operates in a way that lets you combine functions in a composable way. So, we can tackle the problem by breaking it down into smaller problems, solving those problems, and then assembling the solutions into a single query.
The easiest sub-problem to tackle is YESTERDAY. Rather than a specific date, it’s always one day before the current day. So how do we get the current day? In fauna-shell (or via the Web Shell in the Fauna Console: https://dashboard.fauna.com), you can run this query:
> ToDate(Now())
Date("2019-12-13")
So, how do we get YESTERDAY?
> TimeSubtract(ToDate(Now()), 1, "day")
Date("2019-12-12")
Our posts most likely store the post date as a string, so let’s convert our computed date to a string:
> ToString(TimeSubtract(ToDate(Now()), 1, "day"))
'2019-12-12'
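For comparison, here is the same “yesterday as a string” computation in Python (not FQL; just to illustrate the date arithmetic the query performs):

```python
from datetime import date, timedelta

# Compute yesterday's date and format it as an ISO "YYYY-MM-DD" string,
# mirroring ToString(TimeSubtract(ToDate(Now()), 1, "day")) in FQL.
yesterday = (date.today() - timedelta(days=1)).isoformat()
print(yesterday)  # e.g. '2019-12-12' when run on 2019-12-13
```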
The next problem is finding posts with a specific post date. Unlike most SQL databases, FaunaDB does not perform full table scans to locate records, so searching and sorting require an index.
But first, in case you don’t already have one, let’s create a “posts” collection to store all of our post documents:
> CreateCollection({ name: "posts" })
{ ref: Collection("posts"),
ts: 1576256260580000,
history_days: 30,
name: 'posts' }
Then we can create the index:
> CreateIndex({
name: "posts_by_date",
source: Collection("posts"),
terms: [{ field: ["data", "postdate"]}],
values: [{field: ["data", "postdate"]}, {field: ["ref"]}]
})
{ ref: Index("posts_by_date"),
ts: 1576256995310000,
active: false,
serialized: true,
name: 'posts_by_date',
source: Collection("posts"),
terms: [ { field: [ 'data', 'postdate' ] } ],
values: [ { field: [ 'data', 'postdate' ] }, { field: [ 'ref' ] } ],
partitions: 1 }
This index is sourced from the “posts” collection, specifies the “postdate” field as a search term, and for matching records, returns the “postdate” and “ref” field values. “ref” is the primary key field for all documents in FaunaDB.
Before we search for posts, let’s create a few:
> Create(Collection("posts"), { data: { title: "Last week", postdate: "2019-12-06" }})
{ ref: Ref(Collection("posts"), "251656374064251392"),
ts: 1576257070530000,
data: { title: 'Last week', postdate: '2019-12-06' } }
> Create(Collection("posts"), { data: { title: "Yesterday", postdate: "2019-12-12" }})
{ ref: Ref(Collection("posts"), "251656397874266624"),
ts: 1576257093220000,
data: { title: 'Yesterday', postdate: '2019-12-12' } }
> Create(Collection("posts"), { data: { title: "Today", postdate: "2019-12-13" }})
{ ref: Ref(Collection("posts"), "251656411087372800"),
ts: 1576257105810000,
data: { title: 'Today', postdate: '2019-12-13' } }
So now we have three posts. Let’s try to search for one by date. We’ll use Paginate
to retrieve all of the index entries that match:
> Paginate(Match(Index("posts_by_date"), "2019-12-12"))
{ data:
[ [ '2019-12-12', Ref(Collection("posts"), "251656397874266624") ] ] }
In that query, we manually specified yesterday’s date. Let’s replace the manual date with the computed date:
> Paginate(Match(Index("posts_by_date"), ToString(TimeSubtract(ToDate(Now()), 1, "day"))))
{ data:
[ [ '2019-12-12', Ref(Collection("posts"), "251656397874266624") ] ] }
Instead of showing the results, let’s count them:
> Count(Match(Index("posts_by_date"), ToString(TimeSubtract(ToDate(Now()), 1, "day"))))
1
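Conceptually, the index behaves like a lookup table from each term value to the tuples listed in its `values` definition. Here is a toy Python analogue (plain dictionaries, not how Fauna actually stores indexes) of what Match and Count do with our three posts:

```python
from collections import defaultdict

# The three documents created above (with their sample refs).
posts = [
    {"ref": "251656374064251392", "title": "Last week", "postdate": "2019-12-06"},
    {"ref": "251656397874266624", "title": "Yesterday", "postdate": "2019-12-12"},
    {"ref": "251656411087372800", "title": "Today", "postdate": "2019-12-13"},
]

# Build the "index": term (postdate) -> list of [postdate, ref] entries,
# mirroring the index's "terms" and "values" definitions.
posts_by_date = defaultdict(list)
for doc in posts:
    posts_by_date[doc["postdate"]].append([doc["postdate"], doc["ref"]])

# Match(Index("posts_by_date"), "2019-12-12") ~ look up the term:
matches = posts_by_date["2019-12-12"]
print(matches)  # [['2019-12-12', '251656397874266624']]

# Count(Match(...)) ~ count the matching entries:
print(len(matches))  # 1
```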
What happens if we add another post with yesterday’s date?
> Create(Collection("posts"), { data: { title: "Another Yesterday", postdate: "2019-12-12" }})
{ ref: Ref(Collection("posts"), "251657749479293440"),
ts: 1576258382210000,
data: { title: 'Another Yesterday', postdate: '2019-12-12' } }
> Count(Match(Index("posts_by_date"), ToString(TimeSubtract(ToDate(Now()), 1, "day"))))
2
And there we have it: a count of all of the posts created YESTERDAY.