Understanding Protocols in Elixir: A Comprehensive Guide
How Elixir protocols make polymorphism feel like design work. Less ceremony, more clarity, and a path away from giant conditionals.
A system works when every piece knows its role. In Elixir we don’t reach for base classes. We write protocols. Small agreements that let data tell us how it wants to behave. Treat them as design tools and the code scales without turning into a wall of conditionals.
This is a field note on using protocols the way I do on teams. I treat them as contracts between engineering and product intent.
The setup: one encoder for everything
Every API eventually needs to turn random data into JSON. Without protocols, you end up with a sprawling function that has to know about every type in the system:
defmodule JSONEncoder do
  def encode(data) when is_binary(data), do: "\"#{data}\""
  def encode(data) when is_integer(data), do: Integer.to_string(data)
  def encode(data) when is_boolean(data), do: Atom.to_string(data)
  def encode(data) when is_list(data), do: "[#{Enum.map_join(data, ",", &encode/1)}]"
  # More and more clauses for every possible type...
end
It works, but it has problems:
- One module owns everything. Every new type means editing JSONEncoder again.
- Outsiders can’t extend it. Third-party structs can’t plug in without forking.
- It knows too much. The function has to understand every domain concept to do its job.
- It gets long fast. The pattern matches stack up forever.
Protocols give us a calmer approach.
Protocols: contracts, not conditionals
Define the conversation once, let each type answer for itself:
# Step 1: Define the contract
defprotocol JSONEncodable do
  @doc "Converts a value to its JSON string representation"
  def to_json(value)
end

# Step 2: Implement for basic types
defimpl JSONEncodable, for: BitString do
  def to_json(string), do: "\"#{String.replace(string, "\"", "\\\"")}\""
end

defimpl JSONEncodable, for: Integer do
  def to_json(integer), do: Integer.to_string(integer)
end

# Booleans and nil are atoms in Elixir -- there is no Boolean type to
# implement for, so true, false, and nil all dispatch to the Atom impl
defimpl JSONEncodable, for: Atom do
  def to_json(true), do: "true"
  def to_json(false), do: "false"
  def to_json(nil), do: "null"
  def to_json(atom), do: JSONEncodable.to_json(Atom.to_string(atom))
end

defimpl JSONEncodable, for: List do
  def to_json(list) do
    items = Enum.map_join(list, ",", &JSONEncodable.to_json/1)
    "[#{items}]"
  end
end
Now it feels less like a switch and more like a conversation:
iex> JSONEncodable.to_json("hello")
"\"hello\""
iex> JSONEncodable.to_json(42)
"42"
iex> JSONEncodable.to_json(true)
"true"
iex> JSONEncodable.to_json([1, "hello", false])
"[1,\"hello\",false]"
The real payoff shows up when you need to add types you didn’t plan for:
# In a completely different module, even a different library:
defimpl JSONEncodable, for: Date do
  def to_json(date) do
    date
    |> Date.to_iso8601()
    |> JSONEncodable.to_json()
  end
end

defimpl JSONEncodable, for: DateTime do
  def to_json(datetime) do
    datetime
    |> DateTime.to_iso8601()
    |> JSONEncodable.to_json()
  end
end
Notice the pattern. Convert the data into something the protocol already knows, then delegate. No special cases needed.
iex> JSONEncodable.to_json(~D[2024-11-10])
"\"2024-11-10\""
Nothing in the original module changes. The protocol just gets extended for new types as they show up.
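One related extension point worth knowing: a protocol can declare a fallback for types that have no implementation of their own. A minimal sketch, using a toy Stringify protocol (not part of the encoder above):

```elixir
defprotocol Stringify do
  # Dispatch to the Any implementation when a type has no impl of its own
  @fallback_to_any true
  def stringify(value)
end

defimpl Stringify, for: Integer do
  def stringify(i), do: Integer.to_string(i)
end

defimpl Stringify, for: Any do
  # Catch-all for every other type
  def stringify(value), do: inspect(value)
end
```

Without @fallback_to_any, calling a protocol on an unimplemented type raises Protocol.UndefinedError, which is often the failure mode you actually want.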
Maps: the foundation for structs
For the protocol to be actually useful, it needs to handle maps. Most complex data ends up there:
defimpl JSONEncodable, for: Map do
  def to_json(map) when map_size(map) == 0, do: "{}"

  def to_json(map) do
    pairs =
      map
      |> Enum.map(fn {key, value} ->
        key_json = JSONEncodable.to_json(to_string(key))
        value_json = JSONEncodable.to_json(value)
        "#{key_json}:#{value_json}"
      end)
      |> Enum.join(",")

    "{#{pairs}}"
  end
end
Now encoding nested data is straightforward:
iex> data = %{"name" => "Alice", "age" => 30, "active" => true}
iex> JSONEncodable.to_json(data)
"{\"name\":\"Alice\",\"age\":30,\"active\":true}"
# Note: Elixir maps do not guarantee key order, so the keys in your
# output may appear in a different order.
Now your real domain types
Every product type eventually needs encoding. Instead of forcing structs through generic helpers, let each one describe itself:
defmodule User do
  defstruct [:id, :name, :email, :role, :created_at]
end

defmodule Article do
  defstruct [:id, :title, :content, :author_id, :published_at, :tags]
end
Now let’s implement JSON encoding for these domain types:
defimpl JSONEncodable, for: User do
  def to_json(%User{id: id, name: name, email: email, role: role}) do
    %{
      id: id,
      name: name,
      email: email,
      role: role
    }
    |> JSONEncodable.to_json()
  end
end

defimpl JSONEncodable, for: Article do
  def to_json(%Article{id: id, title: title, author_id: author_id, published_at: published_at, tags: tags}) do
    %{
      id: id,
      title: title,
      author_id: author_id,
      published_at: published_at,
      tags: tags
    }
    |> JSONEncodable.to_json()
  end
end
The move here is to pick the fields you actually want to expose, convert to a map, and reuse the Map implementation. Composition beats special cases.
Putting it to work:
iex> user = %User{id: 1, name: "Alice", email: "[email protected]", role: "admin"}
iex> JSONEncodable.to_json(user)
"{\"id\":1,\"name\":\"Alice\",\"email\":\"[email protected]\",\"role\":\"admin\"}"
iex> article = %Article{id: 1, title: "Protocols Guide", author_id: 1, published_at: nil, tags: ["elixir", "tutorial"]}
iex> JSONEncodable.to_json(article)
"{\"id\":1,\"title\":\"Protocols Guide\",\"author_id\":1,\"published_at\":null,\"tags\":[\"elixir\",\"tutorial\"]}"
Validation, same idea
Protocols aren’t just for serialization. I use them for any cross-cutting behavior, validation included. Same idea. Define the contract, let each type own its rules.
Start with the protocol:
defprotocol Validatable do
  @doc "Returns {:ok, value} if valid, {:error, errors} if invalid"
  def validate(value)
end
defimpl Validatable, for: User do
  def validate(%User{name: name, email: email} = user) do
    []
    |> validate_name(name)
    |> validate_email(email)
    |> case do
      [] -> {:ok, user}
      errors -> {:error, Enum.reverse(errors)}
    end
  end

  defp validate_name(errors, name) do
    if valid_name?(name), do: errors, else: [{:name, "Name must be 2-50 characters"} | errors]
  end

  defp validate_email(errors, email) do
    if valid_email?(email), do: errors, else: [{:email, "Email must be valid"} | errors]
  end

  defp valid_name?(name) when is_binary(name) do
    String.length(name) >= 2 and String.length(name) <= 50
  end

  defp valid_name?(_), do: false

  defp valid_email?(email) when is_binary(email) do
    String.contains?(email, "@") and String.length(email) > 5
  end

  defp valid_email?(_), do: false
end
defimpl Validatable, for: Article do
  def validate(%Article{title: title, content: content} = article) do
    []
    |> validate_title(title)
    |> validate_content(content)
    |> case do
      [] -> {:ok, article}
      errors -> {:error, Enum.reverse(errors)}
    end
  end

  defp validate_title(errors, title) do
    if valid_title?(title), do: errors, else: [{:title, "Title must be 5-200 characters"} | errors]
  end

  defp validate_content(errors, content) do
    if valid_content?(content), do: errors, else: [{:content, "Content must be at least 50 characters"} | errors]
  end

  defp valid_title?(title) when is_binary(title) do
    len = String.length(title)
    len >= 5 and len <= 200
  end

  defp valid_title?(_), do: false

  defp valid_content?(content) when is_binary(content) do
    String.length(content) >= 50
  end

  defp valid_content?(_), do: false
end
Each validation reads like a checklist. You accumulate errors as you go and keep them structured. The conditionals stay out of your controllers.
Using the protocol:
iex> user = %User{name: "Alice", email: "[email protected]", role: "admin"}
iex> Validatable.validate(user)
{:ok, %User{name: "Alice", email: "[email protected]", role: "admin"}}
iex> bad_user = %User{name: "A", email: "bad-email"}
iex> Validatable.validate(bad_user)
{:error, [name: "Name must be 2-50 characters", email: "Email must be valid"]}
# Now you can easily check specific field errors:
iex> {:error, errors} = Validatable.validate(bad_user)
iex> Keyword.get(errors, :name)
"Name must be 2-50 characters"
iex> Keyword.has_key?(errors, :email)
true
Keyword lists make downstream consumers happier. Forms, APIs, analytics tooling, all of it.
- Field-specific access: Keyword.get(errors, :email)
- Conditional logic: Keyword.has_key?(errors, :name)
- UI wiring: map errors directly to inputs
- Analytics: count errors by field, not raw strings
- Partial retries: re-validate only what failed
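As a sketch of the UI-wiring point (the form's field list here is hypothetical), keyword errors map straight onto per-field messages:

```elixir
# Errors in the shape Validatable.validate/1 returns
errors = [name: "Name must be 2-50 characters", email: "Email must be valid"]

# One message slot per form field; nil means the field is clean
messages =
  for field <- [:name, :email, :role] do
    {field, Keyword.get(errors, field)}
  end
```

Each form input can then render its own message without scanning free-form strings.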
For nested data, just keep composing:
# Complex validation with nested errors
def validate_nested_data(data) do
  []
  |> validate_user_info(data.user)
  |> validate_preferences(data.preferences)
  |> validate_billing(data.billing)
  |> format_errors(data)
end

# Group errors by section (Enum.group_by/3 -- Keyword has no group_by;
# validate_user_info/2 and friends follow the accumulator pattern above)
defp format_errors([], data), do: {:ok, data}

defp format_errors(errors, _data) do
  grouped = Enum.group_by(errors, &elem(&1, 0), &elem(&1, 1))
  {:error, grouped}
end
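A quick sketch of what the grouping step produces, with hypothetical section tags:

```elixir
# Flat error list tagged by section
errors = [user: "name too short", billing: "card expired", user: "email invalid"]

# Key each error by its section, keeping just the message strings
grouped = Enum.group_by(errors, &elem(&1, 0), &elem(&1, 1))
```

Each section ends up with its own list of messages, in the order they were accumulated.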
Now you can stitch together complex payloads and still keep things readable:
iex> complex_data = %{
...> "user" => %User{id: 1, name: "Alice", email: "[email protected]", role: "admin"},
...> "metadata" => %{"timestamp" => ~U[2024-11-10 14:30:00Z], "version" => 1}
...> }
iex> JSONEncodable.to_json(complex_data)
# Returns complete JSON with proper nesting and type conversion
The standard library already does this
Elixir does this everywhere:
- String.Chars powers string interpolation.
- Inspect keeps debugging readable.
- Enumerable lets any collection participate in Enum.
- Collectable works the other way, building collections from streams.
Implement those for your own structs and the rest of the Elixir ecosystem can work with your data without extra wiring.
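For instance, here is a sketch of implementing String.Chars for the User struct from earlier (the struct is repeated so the snippet stands alone):

```elixir
defmodule User do
  defstruct [:id, :name, :email, :role, :created_at]
end

defimpl String.Chars, for: User do
  # Powers "#{user}" interpolation and Kernel.to_string/1
  def to_string(%User{name: name, email: email}), do: "#{name} <#{email}>"
end
```

With that in place, `"Welcome, #{user}"` works anywhere a string is built, with no custom formatting helpers.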
When not to spin up a protocol
Sometimes a protocol is overkill:
- If the transformation is purely internal, plain functions are clearer.
- If only one type ever uses the behavior, a protocol adds ceremony for nothing.
- If every type behaves identically, a helper module is enough.
- If the contract is going to keep changing, plain functions are easier to refactor.
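To make the second bullet concrete: if only articles ever get slugified, a plain module (the Slug module here is hypothetical) is clearer than a one-implementation protocol:

```elixir
defmodule Slug do
  # A plain function: no protocol ceremony for single-type behavior
  def from_title(title) when is_binary(title) do
    title
    |> String.downcase()
    |> String.replace(~r/[^a-z0-9]+/, "-")
    |> String.trim("-")
  end
end
```

If a second type ever needs slugs, promoting this to a protocol later is a mechanical change.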
Design heuristics I keep handy
- Solve a real problem. Protocols should remove friction people already feel.
- Keep the contract small. One responsibility, well documented.
- Pick clear names. Long names beat clever abbreviations.
- Help your future self. Leave docstrings that explain why the protocol exists.
Closing the loop
Protocols are the quiet handshake that keeps Elixir code flexible. Each type voices what it needs and the system listens. Nobody is editing a giant conditional just to add a new type.
Find the function in your codebase that knows too much. Turn it into a protocol. The rest of the system gets a lot easier to work with.