Exposing table name and field names in request URL

I was tasked to create this Joomla component (yep, Joomla, but that's unrelated), and a professor told me that I should make my code as dynamic as possible (code that needs less maintenance) and avoid hard coding. The approach we initially thought of was to take the URL parameters, turn them into objects, and pass them to the query.

Let's say we want to read the hotel with id 1 from the table "hotels", and that the table has the fields "hotel_id", "hotel_name" and some others.

Now, the approach we took to building the SQL query string was to parse a URL request that looks like this:

index.php?task=view&table=hotels&hotel_id=1&param1=something&param2=somethingelse

and turn it into a PHP object like this (shown as its JSON equivalent, which is easier to read):

obj = {
  'table': 'hotels',
  'conditions': {
    'hotel_id': '1',
    'param1': 'something',
    'param2': 'somethingelse'
  }
}
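
A minimal sketch of that parsing step, using plain $_GET; the helper name here is hypothetical and is only meant to show the shape of the data:

// Hypothetical helper: split the request into the table name and the
// remaining key/value pairs that will become WHERE conditions.
function parseRequest(array $get)
{
    $table = isset($get['table']) ? $get['table'] : null;

    // Everything except the routing keys becomes a condition.
    $conditions = $get;
    unset($conditions['task'], $conditions['table']);

    return array(
        'table'      => $table,
        'conditions' => $conditions,
    );
}

// index.php?task=view&table=hotels&hotel_id=1&param1=something&param2=somethingelse
$request = parseRequest($_GET);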

and the SQL query will be built by looping over the conditions and appending them to the string, so that the field and value of each WHERE clause are the key and value from the object (still in JSON form for ease):

SELECT * FROM obj.table WHERE hotel_id=1 AND param1=something and so on...
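
For illustration, here is roughly how that loop could look. This is only a sketch: it uses PDO prepared statements for the values (the component itself would presumably use Joomla's database layer), and the connection details and whitelists are placeholders of mine, not part of the component:

// Placeholder connection and whitelists -- in practice these would come
// from the site configuration and from a config like the one in the edit below.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$allowedTables = array('hotels');
$allowedFields = array('hotel_id', 'name', 'description', 'featured', 'published');

// The parsed request from above, as a PHP array.
$request = array(
    'table'      => 'hotels',
    'conditions' => array('hotel_id' => '1'),
);

if (!in_array($request['table'], $allowedTables, true)) {
    exit('Unknown table');
}

$clauses = array();
$values  = array();
foreach ($request['conditions'] as $field => $value) {
    // Identifiers cannot be bound as placeholders, so unknown field
    // names are dropped instead of being concatenated into the SQL.
    if (!in_array($field, $allowedFields, true)) {
        continue;
    }
    $clauses[] = '`' . $field . '` = ?';
    $values[]  = $value;
}

$sql = 'SELECT * FROM `' . $request['table'] . '`';
if ($clauses) {
    $sql .= ' WHERE ' . implode(' AND ', $clauses);
}

$stmt = $pdo->prepare($sql);
$stmt->execute($values);   // the values go in as bound parameters
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);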

The problem that bugged me was exposing the table name and field names in the request URL. I know it poses a security risk to expose items that should only be visible on the server side. The solution I'm currently considering is giving every table and field an alias for the client side, but that would be hard coding, which is against his policy. Besides, if I had a thousand tables to alias, it would not be practical.

What is the proper method to do this, one that:

  1. avoids hard coding stuff
  2. keeps the code dynamic and adaptable

EDIT:

Regarding the arbitrary queries (I forgot to include this): what currently stops them on the back end is a function that takes a reference from a hard-coded object (more like a config file, shown here) and parses the URL, picking out parameters and matching them against it.

The config looks like:

// 'hotels' here is the table name. Instead of parsing the URL for a table name,
// PHP will just look the table up in this config; if there is no match, return an error.
// This reduces the risk of arbitrary tables.

'hotels' => array(      

  // fields and their types, used to identify what filter to use

  'structure' => array(  
    'hotel_id'=>'int',
    'name'=>'string',
    'description'=>'string',
    'featured'=>'boolean',
    'published'=>'boolean'
  ),

   // These are the 'tasks' and their accepted parameters, based on the fields above.
   // These are the actual parameter names, which (as I said) are the same as the field names.
   // The ones in 'values' are usually values for inserting and updating;
   // the ones in 'conditions' are used in the WHERE part of the query.

  'operations' => array(
    'add' => array(
      'values' => array('name','description','featured'),
      'conditions' => array()
    ),
    'view' => array(
      'values' => array(),
      'conditions' => array('hotel_id')
    ),
    'edit' => array(
      'values' => array('name','description','featured'),
      'conditions' => array('hotel_id')
    ),
    'remove' => array(
      'values' => array(),
      'conditions' => array('hotel_id')
    )
  )
)

and so, from that config list (a rough sketch of this validation is shown after the list):

  • if the parameters sent for a task are not complete, the server returns an error.
  • if a parameter appears twice in the URL, only the first one read is taken.
  • any other parameters not in the config are discarded.
  • if a task is not allowed for a table, it simply isn't listed for that table.
  • if a task is not in the config, the server returns an error.
  • if a table is not in the config, the server returns an error.
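
A rough sketch of how that validation function could look; the names here are mine, $config stands for the array shown above, and type filtering via 'structure' is omitted for brevity:

// Hypothetical validator: returns the filtered parameters for a task,
// or false when the table/task is unknown or a required condition is missing.
function validateRequest(array $config, $table, $task, array $params)
{
    if (!isset($config[$table])) {
        return false;   // table not in the config
    }
    if (!isset($config[$table]['operations'][$task])) {
        return false;   // task not allowed for this table
    }

    $operation = $config[$table]['operations'][$task];
    $clean     = array('values' => array(), 'conditions' => array());

    foreach ($operation['conditions'] as $field) {
        if (!isset($params[$field])) {
            return false;   // a required condition is missing
        }
        $clean['conditions'][$field] = $params[$field];
    }
    foreach ($operation['values'] as $field) {
        if (isset($params[$field])) {
            $clean['values'][$field] = $params[$field];
        }
    }

    // Parameters not listed in the config are simply never copied across.
    return $clean;
}

// e.g. validateRequest($config, 'hotels', 'view', $_GET);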

I actually patterned this after a component in Joomla that uses this strategy. It reduces the model and controller to four dynamic functions (the CRUD operations), leaving the config file as the only file that needs editing later on (this is what I meant by dynamic code: I only add tables and tasks when further tables are needed). But I fear it may pose a security risk I'm not yet aware of.

Any ideas for an alternative?

asked Jul 24 '11 by Joseph


1 Answer

I have no problem with using the same (or very similar) names in the URL and the database — sure, you might be "exposing" implementation details, but if you're choosing radically different names in the URL and the DB, you're probably choosing bad names. I'm also a fan of consistent naming — communication with coders/testers/customers becomes much more difficult if everyone calls everything something slightly different.

What bugs me is that you're letting the user run arbitrary queries on your database. http://.../index.php?table=users&user_id=1, say? Or http://.../index.php?table=users&password=password (not that you should be storing passwords in plaintext)? Or http://.../index.php?table=users&age=11?

If the user connected to the DB has the same permissions as the user sitting in front of the web browser, it might make sense. Generally, that's not going to be the case, so you'll need some layer that knows what the user is and isn't allowed to see, and that layer is a lot easier to write correctly by whitelisting.
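
As a purely hypothetical illustration of such a layer (none of these names come from the answer), a per-role whitelist of readable columns might look like this:

// Which columns each kind of user is allowed to read.
$visibleColumns = array(
    'guest' => array('hotels' => array('hotel_id', 'name', 'description')),
    'admin' => array('hotels' => array('hotel_id', 'name', 'description', 'featured', 'published')),
);

function columnAllowed(array $visibleColumns, $role, $table, $column)
{
    return isset($visibleColumns[$role][$table])
        && in_array($column, $visibleColumns[$role][$table], true);
}

// columnAllowed($visibleColumns, 'guest', 'hotels', 'published');  // false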

(If you've stuck enough logic into stored procedures, then it might work, but then your stored procedures will hard-code column names...)

answered Oct 31 '22 by tc.