 

Is it okay to "repeatedly" XSS-clean data in CodeIgniter?

The following are ways to XSS-clean data in CodeIgniter:

  • set global_xss_filtering in config to TRUE
  • use xss_clean()
  • use xss_clean as a validation rule
  • set the second parameter to TRUE in $this->input->post('something', TRUE)
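For reference, the four approaches look roughly like this in code (a sketch following CodeIgniter 2.x conventions; the field name `something` is just an example):

```php
// 1. In application/config/config.php -- filters every incoming request:
$config['global_xss_filtering'] = TRUE;

// 2. Calling the Security class filter directly on a value:
$clean = $this->security->xss_clean($this->input->post('something'));

// 3. As a form validation rule (available in CodeIgniter 2.x):
$this->form_validation->set_rules('something', 'Something', 'required|xss_clean');

// 4. Passing TRUE as the second argument to an input method:
$clean = $this->input->post('something', TRUE);
```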

Is it okay to use all or more than one of them on one piece of data?

For example, would it be okay if I still used $this->input->post('something', TRUE) even if the data has already been cleaned by global_xss_filtering and xss_clean validation rule?

asked Mar 15 '12 by Obay

2 Answers

It's not going to hurt you, but it is definitely pointless.

There's a very good chance that eventually the global XSS filter will become cumbersome. Since it can't be disabled per controller without extensive hacks, and access to the raw $_REQUEST data is impossible while it's on, you will end up needing to disable it globally. That will happen the moment you want to process a single piece of trusted data, or data that isn't HTML output and must remain intact.

Using it as a form validation rule is pointless and potentially destructive as well. Imagine what this site would be like if every time you typed <script> it was replaced with [removed], with no way to revert it later. For another example, what if a user uses some "XSS" content in his password? Your application will end up silently altering the input.

Just use the XSS filter where you need it: on your HTML output, in places where JavaScript can be executed.
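A minimal sketch of that approach: store the input raw, and escape it at output time where the context is known. This assumes CodeIgniter's `html_escape()` helper (a wrapper around PHP's `htmlspecialchars()`); the `comment` field name is illustrative:

```php
// On the way in: store the raw input, no XSS filtering.
$comment = $this->input->post('comment');

// On the way out, in the view, escape for the HTML context:
echo html_escape($comment);
// or with plain PHP:
echo htmlspecialchars($comment, ENT_QUOTES, 'UTF-8');
```

Escaping at output keeps the stored data intact (passwords, code samples, etc.) while still neutralizing it wherever it is rendered as HTML.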

answered Oct 21 '22 by Wesley Murch


Yes. Assume your input is 'A'. Then, let's say you run xss_clean on it to get XSS-safe content:

B = xss_clean(A)

Now, let's say I do it again to get C:

C = xss_clean(B)

Now, if B and C differ, it must mean that B still contained some XSS-unsafe content, which in turn would mean that xss_clean is broken, since it did not clean A properly. So as long as you assume that the function returns XSS-safe content, you are good to go.
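That argument can be checked directly: run the filter twice and compare the results (a sketch using CodeIgniter's Security class; the variable names are illustrative):

```php
$B = $this->security->xss_clean($A);
$C = $this->security->xss_clean($B);

if ($B !== $C) {
    // The filter altered content it had already "cleaned",
    // i.e. it is not idempotent for this input.
    log_message('error', 'xss_clean is not idempotent for this input');
}
```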

One argument that can be made: what if the function modifies even XSS-safe content? That would be unfortunate, and it would still mean the function is broken, but in my experience that is not the case (I have never seen it behave that way).

The only drawback I see is the additional processing overhead, but running it twice (once via global filtering, and once explicitly, in case global filtering is ever turned off by someone) is a fairly reasonable cost for the security assurance.

Also, if I may add, CodeIgniter's xss_clean doesn't really parse the HTML and drop tags. It simply converts < and > to &lt; and &gt;. With that in mind, I don't see anything that could go wrong.

answered Oct 21 '22 by Rohan Prabhu