I have a Kubernetes (0.15) cluster running on CoreOS instances on Amazon EC2.
When I create a service that I want to be publicly accessible, I currently add some private IP addresses of the EC2 instances to the service description, like so:
{
    "kind": "Service",
    "apiVersion": "v1beta3",
    "metadata": {
        "name": "api"
    },
    "spec": {
        "ports": [
            {
                "name": "default",
                "port": 80,
                "targetPort": 80
            }
        ],
        "publicIPs": ["172.1.1.15", "172.1.1.16"],
        "selector": {
            "app": "api"
        }
    }
}
Then I can add these IPs to an ELB load balancer and route traffic to those machines.
But for this to work I need to maintain the list of all the machines in my cluster in every service that I am running, which feels wrong.
What's the currently recommended way to solve this?
(I know of `createExternalLoadBalancer`, but that does not seem to support AWS yet.)
If anyone reaches this question: external load balancer support is available in the latest Kubernetes versions.
Link to the documentation
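For reference, a minimal sketch of what that looks like (assuming the `v1` API and a cluster configured with the AWS cloud provider): setting `"type": "LoadBalancer"` on the service asks Kubernetes to provision an ELB for you, so you no longer maintain a `publicIPs` list by hand.

```json
{
    "kind": "Service",
    "apiVersion": "v1",
    "metadata": {
        "name": "api"
    },
    "spec": {
        "type": "LoadBalancer",
        "ports": [
            {
                "name": "default",
                "port": 80,
                "targetPort": 80
            }
        ],
        "selector": {
            "app": "api"
        }
    }
}
```

Once the load balancer is provisioned, its address appears under `status.loadBalancer.ingress` on the service (e.g. via `kubectl get service api -o yaml`).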