I've searched the Elixir and Phoenix docs, as well as a few other sites like Learn Elixir, with no luck. Here is what it looks like:
defp update_positions(item_ids) do
  item_ids =
    String.split(item_ids, ",")
    |> Enum.map(fn item_id -> String.to_integer(item_id) end)

  items = Repo.all(Item |> where([item], item.id in array(^item_ids, :integer)))
  item_hash = Enum.reduce(items, %{}, fn item, map -> Map.put(map, item.id, item) end)

  item_ids
  |> Stream.with_index
  |> Enum.each(fn {item_id, index} ->
    item = item_hash[item_id]
    Repo.update(%{item | position: index + 1})
  end)
end
At first I thought it was just a line continuation symbol to keep code readable, but the Item |> where
line above suggests otherwise. Is it a list comprehension or something specifying input types?
This is the pipe operator. From the docs for Kernel.|>/2: "This operator introduces the expression on the left-hand side as the first argument to the function call on the right-hand side."
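A minimal illustration: the piped form is just another way to write the plain call, with the left side passed as the first argument, so both of these return "HELLO":

String.upcase("hello")       # => "HELLO"
"hello" |> String.upcase()   # => "HELLO", same call, left side becomes the first argument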
The @ symbol in Elixir denotes module attributes, which are useful compile-time settings. You often see them in places where you might put class constants in an OO language.
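A small sketch of that idea, using a made-up Circle module where @pi plays the role of a class constant:

defmodule Circle do
  # module attribute: resolved at compile time, used like a class constant
  @pi 3.14159

  def area(radius), do: @pi * radius * radius
end

Circle.area(2)  # => 12.56636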
Starting in Elixir 1.2 there is an i command in iex that will print the type and other information about any Elixir value. If you look at the code for the i command you'll see that it is implemented via a Protocol.
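For example, inspecting a string (the exact fields printed vary by Elixir version, so treat this output as approximate):

iex(1)> i "hello"
Term
  "hello"
Data type
  BitString
Byte size
  5
Reference modules
  String, :binary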
I'll copy from my Elixir Express workshop material: https://github.com/chrismccord/elixir_express/blob/master/basics/06_pipeline_operator.md
One of the simplest, yet most effective features in Elixir is the pipeline operator. The pipeline operator solves the issue many functional languages face when composing a series of transformations where the output from one function needs to be passed as the input to another. This requires solutions to be read in reverse to understand the actions being performed, hampering readability and obscuring the true intent of the code. Elixir elegantly solves this problem by allowing the output of a function to be piped as the first parameter to the input of another. At compile time, the functional hierarchy is transformed into the nested, "backward" variant that would otherwise be required.
iex(1)> "Hello" |> IO.puts
Hello
:ok
iex(2)> [3, 6, 9] |> Enum.map(fn x -> x * 2 end) |> Enum.at(2)
18
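That second pipeline is equivalent to the nested call below, which is the "backward" form the compiler produces:

iex(3)> Enum.at(Enum.map([3, 6, 9], fn x -> x * 2 end), 2)
18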
To grasp the full utility the pipeline provides, consider a module that fetches new messages from an API and saves the results to a database. The sequence of steps would be:
Without Pipeline:
defmodule MessageService do
  ...

  def import_new_messages(user_token) do
    Enum.each(
      parse_json_to_message_list(
        fetch(find_user_by_token(user_token), "/messages/unread")
      ),
      &save_message(&1)
    )
  end

  ...
end
Proper naming and indentation help the readability of the previous block, but its intent is not immediately obvious without first taking a moment to decompose the steps from the inside out and work out the data flow.
Now consider this series of steps with the pipeline operator:
With Pipeline:
defmodule MessageService do
  ...

  def import_new_messages(user_token) do
    user_token
    |> find_user_by_token
    |> fetch("/messages/unread")
    |> parse_json_to_message_list
    |> Enum.each(&save_message(&1))
  end

  ...
end
Piping the result of each step as the first argument to the next allows programs to be written as a series of transformations that any reader can immediately read and comprehend, without expending extra effort to unwrap the functions as in the first solution.
The Elixir standard library focuses on placing the subject of the function as the first argument, aiding and encouraging the natural use of pipelines.
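For example, String and Enum functions take the data being transformed as their first argument, so they chain without any wrapper functions (a small illustration):

"  hello world  "
|> String.trim()
|> String.split(" ")
|> Enum.map(&String.upcase/1)
|> Enum.join("-")
# => "HELLO-WORLD"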