
Conversation

@s0meguy1

Hi,

I found that on certain sites, I get the error:

[?] Testing all provided paths to check to CRLF injection. This is specially interesting if the site uses S3 buckets or GCP to host files
Traceback (most recent call last):
  File "/root/nginxpwner/./nginxpwner.py", line 226, in <module>
    uri_crlf_test= requests.get(f"{url}/{pathline.strip()}%0d%0aDetectify:%20clrf", verify=False)
  File "/usr/local/lib/python3.10/dist-packages/requests/api.py", line 75, in get
    return request('get', url, params=params, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/api.py", line 61, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests_raw/__init__.py", line 29, in __request
    return _request(self, method, url, *args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/sessions.py", line 542, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/sessions.py", line 655, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/adapters.py", line 439, in send
    resp = conn.urlopen(
  File "/usr/local/lib/python3.10/dist-packages/urllib3/connectionpool.py", line 627, in urlopen
    parsed_url = parse_url(url)
  File "/usr/local/lib/python3.10/dist-packages/urllib3/util/url.py", line 394, in parse_url
    return six.raise_from(LocationParseError(source_url), None)
  File "<string>", line 3, in raise_from
urllib3.exceptions.LocationParseError: Failed to parse: //dist%0D%0ADetectify:%20clrf

I didn't dig into why this was happening, but I made a quick fix and added some error handling at line 229.
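For reference, here is a minimal sketch of that kind of guard, assuming the fix simply skips paths that urllib3 refuses to parse (the actual diff may differ). The names `url` and `pathline` are taken from the traceback above; the stub values exist only to make the snippet self-contained:

```python
import requests
import urllib3

# url and pathline come from the surrounding nginxpwner loop (see the
# traceback); these stubs are placeholders so the sketch runs on its own.
url = "https://example.com"
pathline = "dist\n"

try:
    uri_crlf_test = requests.get(
        f"{url}/{pathline.strip()}%0d%0aDetectify:%20clrf", verify=False
    )
except (urllib3.exceptions.LocationParseError,
        requests.exceptions.RequestException):
    # Some generated URLs (e.g. "//dist%0D%0ADetectify:%20clrf") cannot be
    # parsed by urllib3, which raises LocationParseError; skip the path
    # instead of letting the whole scan crash.
    uri_crlf_test = None
```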

Fixed "Failed to parse" issue