
Internal REST API

We currently have three sites in our network: one written in Ruby on Rails and the other two written in PHP. All of the sites tend to share a lot of the same data and logic, and I find myself repeating on the PHP side a lot of the work I do on the Rails side. It seems like we need a common internal API to consolidate this. I've never built an API before, and I have a few questions.

  • Performance: If I build the API as a separate application, it seems like everything will be twice as slow, since each call has to go through the entire request/response cycle on the API end and then again on the public application side. Is there a way to make this faster, or maybe a different approach?

  • API access via local network: How would I access the API via the local network? Would I set up a virtual host in Apache that points to 127.0.0.1? (Roughly what I have in mind is sketched after this list.)

  • ActiveResource: In my case (on the Rails end), is ActiveResource the best way to go, or are there better options for consuming the API? I'm also wondering how validations will work on the public side. Does ActiveResource reuse the validation rules, or will I have to recreate them on the public side?

  • API security: I'm thinking that I won't have to worry too much about this right now, since the API can (ideally) only be accessed via the local network. Am I correct in this assumption?
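For the local-network question, here is roughly the kind of Apache virtual host I have in mind. This is only a sketch: api.internal, the port, and the paths are placeholders, and the access directives use Apache 2.4 syntax.

    # Hypothetical vhost bound to the loopback interface, so the API is only
    # reachable from this machine; use a private LAN IP instead if the
    # consuming apps run on other hosts.
    Listen 127.0.0.1:8080

    <VirtualHost 127.0.0.1:8080>
        ServerName api.internal
        DocumentRoot /var/www/api/public

        <Directory /var/www/api/public>
            Require ip 127.0.0.1
        </Directory>
    </VirtualHost>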

Dan


1 Answer

Whole books have been written on this particular topic: managing latency, stability, flexibility, and all the other -ilities that come up when creating services.

From a very general standpoint, and for the relatively simple use case it sounds like you have, I would recommend simple REST-based APIs in each application. You definitely don't want to repeat in PHP code that is already written in Ruby, and vice versa. Response times won't vary much between competing HTTP-based query methods (not as drastically as between, say, HTTP and CORBA, anyway). Exposing REST resources is what Ruby on Rails is good at, so you pretty much get that for free. PHP is a bit harder; you just have to structure your queryable API so that it follows REST conventions.

After that, you just need an HTTP client to make requests against either application. If you have well-defined endpoints for each application, hard-coding them shouldn't be much of a problem. If not, there is a whole design pattern around enterprise service buses that helps one service discover another, but it generally doesn't have cross-platform capability (at least not PHP to-and-from Rails, that I'm aware of!).
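To make that concrete, here is a minimal sketch of exposing a resource from the Rails side. The Product model and the JSON-only routes are assumptions for illustration, not something from the question.

    # config/routes.rb -- "products" is a hypothetical shared resource
    resources :products, only: [:index, :show], defaults: { format: :json }

    # app/controllers/products_controller.rb
    class ProductsController < ApplicationController
      # GET /products.json -- list every product as JSON
      def index
        render json: Product.all
      end

      # GET /products/:id.json -- one product as JSON
      def show
        render json: Product.find(params[:id])
      end
    end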

For the Ruby/Rails world, I can recommend HTTParty or Typhoeus as HTTP clients (to query the REST interface of the other application). For the PHP world, you may want to peek at this thread; it has a few listed.
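As a rough sketch of the consuming side with HTTParty, against the hypothetical endpoint from above (the host name and resource are placeholders):

    # Gemfile: gem "httparty"
    require "httparty"

    class InternalApi
      include HTTParty
      base_uri "http://api.internal"  # hypothetical internal host

      # Fetch the shared resource from the other application's REST API.
      def self.products
        get("/products.json").parsed_response
      end
    end

    InternalApi.products  # => array of product hashes parsed from the JSON body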

To be clear, this isn't your only option. You could create full-blown web services, or go deeper still with direct socket connections, and so on. It really all depends on how tightly coupled you want the systems to be, how tightly coupled they can be, how close they are (in a network-topology sense), and how fast each system has to respond.

Regarding security: one option is to set up the firewall on each system to accept connections to a particular resource only when they come from a particular IP. You could follow a similar pattern at the application level. Though you might have just as much luck securing a standard HTTPS session with basic/digest auth.
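If you go the HTTPS-plus-basic-auth route, the client side stays almost identical to the sketch above; the host and credentials here are placeholders:

    require "httparty"

    class InternalApi
      include HTTParty
      base_uri "https://api.internal"              # TLS instead of plain HTTP
      basic_auth ENV["API_USER"], ENV["API_PASS"]  # placeholder credentials

      def self.products
        get("/products.json").parsed_response
      end
    end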

Hope this helps. I could go on about this for much longer, but I don't know that there's any good way to give one specific answer to a question this general. If there is something I can expand on, feel free to comment and I will as I can.

Christopher WJ Rueber