* Attempt 2 of cert validation fixes
* Remove unused code
* Cleanup the tmp cert using atexit
* Fix linting issues
* Only add SSLValidationHandler when not HAS_SSLCONTEXT
* Catch ValueError on non-PEM certs
* Only catch NotImplementedError to avoid masking issues
* set self._context even with PyOpenSSLContext for conformity
* Fix error building
* normalize how we interact with the context we create
* Remove unused code
* Address test for py3.7 message difference
* open_url should pass the ca_path through
* Account for new error in url lookup test
* Guard some code behind whether or not we are validating certs
* Make _make_context public
* Move atexit.register up to where the tmp file is created
- Also return url and update docs for other values to indicate they are only returned on success.
- Add integration tests
- Use info variable for common return values
- Use -1 as default status rather than None. This lines up with existing code in urls.py
- Add unit tests to ensure status and url are returned on failure
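A minimal sketch of consuming these return values from a module; `fetch_url` and the info dict are the standard Ansible pieces, the URL and wrapper function are illustrative.

```python
# Sketch (runs inside an Ansible module): on failure, info still carries
# 'status' (-1 when no HTTP response was received) and 'url'.
from ansible.module_utils.urls import fetch_url

def check_endpoint(module, url):
    resp, info = fetch_url(module, url, method='GET', timeout=10)
    if info['status'] == -1:
        # No HTTP status at all, e.g. connection refused or SSL failure.
        module.fail_json(msg=info['msg'], status=info['status'], url=info['url'])
    return info['status'], info['url']
```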
* uri/win_uri: Make method a free text field
Since various interfaces introduce their own HTTP methods (e.g. PROPFIND,
LIST or TRACE), it's better to leave this up to the user.
* Fix HTTP method check in module_utils urls
* Add integration test for method UNKNOWN
* Clarify the change as requested during review
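A hedged sketch of exercising a non-standard verb directly through `open_url`; the WebDAV endpoint URL is illustrative.

```python
# Sketch: with the method check relaxed, any free-text HTTP verb can be passed.
from ansible.module_utils.urls import open_url

# PROPFIND is a WebDAV verb that is not in the old hard-coded list.
resp = open_url('https://dav.example.com/files/', method='PROPFIND',
                headers={'Depth': '1'}, timeout=10)
print(resp.getcode())
```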
* First pass at allowing unix socket with urls/uri. See #42341
* Only insert handler as needed
* Fix and add tests
* Add HTTPS functionality for unix sockets
* Additional test fixes
* Create context manager for monkey patching HTTPConnection.connect, de-dupe code, raise better errors
* doc
* Add a few more tests
* Fix __call__
* Remove unused import
* Patch HTTPConnection.connect with the functionality we want, instead of duplicating code and disabling
* Fix var name
* Remove unused var
* Add changelog fragment
* Update uri docs
* Fix rebase indentation issue
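A hedged usage sketch of the unix socket support added above; the `unix_socket` keyword is the parameter this change introduces (assumed spelling), and the socket path and URL are illustrative.

```python
# Sketch: talk to a daemon listening on a local unix socket (e.g. Docker),
# while still speaking plain HTTP over that socket.
from ansible.module_utils.urls import open_url

resp = open_url('http://localhost/version', unix_socket='/var/run/docker.sock')
print(resp.read())
```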
* identity: Add GSSAPI support for FreeIPA authentication
This enables the use of GSSAPI for authentication instead of having to
pass the username and password as part of the playbook run.
If GSSAPI support is available, the password becomes optional and the
KRB5_CLIENT_KTNAME or KRB5CCNAME environment variables, which are
standard when using Kerberos authentication, can be used.
Note that this depends on the urllib_gssapi library, and it is only
enabled if that library is available.
* identity: Add documentation for GSSAPI authentication for FreeIPA
This documentation describes how to use GSSAPI authentication with the
IPA identity modules.
* identity: Add changelog for GSSAPI support for IPA
This adds the changelog entry for the GSSAPI authentication feature for
the IPA identity module.
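A rough sketch of the fallback logic described above, assuming urllib_gssapi exposes an HTTPSPNEGOAuthHandler (that name is an assumption, as is the surrounding helper); this is not the module's actual code.

```python
# Sketch: prefer an existing Kerberos credential (ticket cache or keytab)
# and only require a password when GSSAPI support is unavailable.
import os

try:
    from urllib_gssapi import HTTPSPNEGOAuthHandler  # assumption: handler name
    HAS_GSSAPI = True
except ImportError:
    HAS_GSSAPI = False

def choose_auth(username, password):
    have_krb_env = bool(os.environ.get('KRB5CCNAME') or
                        os.environ.get('KRB5_CLIENT_KTNAME'))
    if HAS_GSSAPI and have_krb_env and not password:
        return HTTPSPNEGOAuthHandler()   # negotiate using the Kerberos credential
    if not password:
        raise ValueError('password is required without GSSAPI support')
    return (username, password)          # fall back to username/password login
```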
* module_utils.urls: add fetch_file function
* apt: use fetch_file instead of own download()
* unarchive: use fetch_file instead of own codecopy
* apt: add test for deb=http://…
* unarchive: add test for a remote file download and unarchive
* yum: replace fetch_rpm_from_url by fetch_file
* use NamedTemporaryFile
* don't add a dot to fileext, it's already there
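A short sketch of the shared helper from a module author's point of view; the URL is illustrative and keyword arguments beyond `url` are omitted.

```python
# Sketch (inside an Ansible module): download a remote file to a temporary
# path via the common helper instead of a per-module download routine.
from ansible.module_utils.urls import fetch_file

def get_package(module, url):
    # fetch_file downloads url into a NamedTemporaryFile and returns its path;
    # the temp file is cleaned up when the module exits.
    return fetch_file(module, url)
```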
If a user exports https_proxy without a scheme, generic_urlparse
fails to parse the scheme and hostname from the https_proxy variable.
This fix raises ProxyError if the generic_urlparse API fails to parse
either of them.
Fixes: #25421
Signed-off-by: Abhijeet Kasurde <akasurde@redhat.com>
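A hedged sketch of the check; `ProxyError` and `generic_urlparse` exist in module_utils, while the wrapper function and error wording are illustrative.

```python
# Sketch: reject a proxy value such as "proxy.example.com:3128" (no scheme)
# with a clear error instead of failing later with a confusing parse result.
from ansible.module_utils.urls import ProxyError, generic_urlparse
from ansible.module_utils.six.moves.urllib.parse import urlparse

def parse_proxy(proxy_url):
    parts = generic_urlparse(urlparse(proxy_url))
    if not parts.get('scheme') or not parts.get('hostname'):
        raise ProxyError('Failed to parse https_proxy environment variable. '
                         'Please export it as scheme://host:port')
    return parts
```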
* Adds requests.Session like class
* py2 syntax fix
* Add a few examples to the Request docstrings
* Add helper methods and docs
* Fix test failures
* Switch tests to test Request instead of open_url, add simple open_url test to validate functionality
* Fix filename in replace-urlopen code smell test
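A hedged usage example of the Session-like class; the URLs and payload are illustrative.

```python
# Sketch: Request keeps per-instance defaults (headers, timeouts, cookies)
# across calls, similar to requests.Session.
from ansible.module_utils.urls import Request

r = Request(headers={'Accept': 'application/json'}, timeout=30)
resp = r.get('https://api.example.com/status')   # helper method per HTTP verb
print(resp.getcode(), resp.read())
resp = r.post('https://api.example.com/items', data=b'{"name": "demo"}')
```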
When running test_open_url_no_validate_certs from
test/units/module_utils/urls/test_open_url.py, the test fails because of the SSLv2
check.
The test was run on a machine using OpenSSL 1.1.0g. The OpenSSL
man page[1] shows that support for SSLv2 has been removed:
> Support for SSLv2 and the corresponding SSLv2_method(),
> SSLv2_server_method() and SSLv2_client_method() functions where removed
> in OpenSSL 1.1.0.
>
> SSLv23_method(), SSLv23_server_method() and SSLv23_client_method() were
> deprecated and the preferred TLS_method(), TLS_server_method() and
> TLS_client_method() functions were introduced in OpenSSL 1.1.0.
Hence this commit removes the use of this flag when it is not defined.
[1] https://www.openssl.org/docs/man1.1.0/ssl/SSLv23_method.html
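A minimal sketch of such a guard, assuming the surrounding code builds an option mask for SSLv23-style connections; this is not the exact test code.

```python
# Sketch: guard references to SSLv2 symbols, since Python only defines
# PROTOCOL_SSLv2 when the underlying OpenSSL still ships SSLv2 support
# (it was removed entirely in OpenSSL 1.1.0).
import ssl

if hasattr(ssl, 'PROTOCOL_SSLv2'):
    # Legacy builds: explicitly exclude SSLv2 and SSLv3.
    options = ssl.OP_NO_SSLv2 | ssl.OP_NO_SSLv3
else:
    # OpenSSL 1.1.0+: SSLv2 is gone, only SSLv3 needs excluding.
    options = ssl.OP_NO_SSLv3
```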
* Handle duplicate headers, and make it easier for users to use cookies, by providing a pre-built string
* Ensure proper cookie ordering, make key plural
* Add note about cookie sort order
* Add tests for duplicate headers and cookies_string
* Extend tests, normalize headers between py2 and py3
* Add some notes in test code
* Don't use AttributeError, use six.PY3. Use better names.
* Start of tests for ansible.module_utils.urls
* Start adding file for generic functions throughout urls
* Add tests for maybe_add_ssl_handler
* Remove commented out line
* Improve coverage of maybe_add_ssl_handler, test basic_auth_header
* Start tests for open_url
* pep8 and ignore urlopen in test_url_open.py tests
* Extend auth tests, add test for validate_certs=False
* Finish tests for open_url
* Add tests for fetch_url
* Add fetch_url tests to replace-urlopen ignore
* dummy instead of _
* Add BadStatusLine test
* Reorganize/rename tests
* Add tests for RedirectHandlerFactory
* Add POST test to confirm behavior is to convert to GET
* Update tests to handle recent changes to RedirectHandlerFactory
* Special test, just to confirm that aliasing http_error_308 to http_error_307 does not cause issues with urllib2 type redirects
* Required changes to support redirects on HTTP 307/308
This ensures HTTP 307 and 308 will redirect the request to the new
location without modification.
* Fix the unused newheaders reference
* Be more compliant
* Add integration tests for follow_redirects=all
* Improve other tests for new behaviour
* Make follow_redirects values more strict
This is required if we want to ensure that #36809 doesn't cause any
important behavioral changes.
This PR changes the uri module to support follow_redirects=urllib2.
It also adds a better error message when the connection closes before
any data was returned.
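A hedged sketch of the core idea: unlike 301/302/303, a 307/308 must re-issue the same method and body at the new location. The names mirror urllib's RedirectHandler, but the class itself is illustrative, not the module's code.

```python
# Sketch: preserve method and body on 307/308 instead of downgrading to GET.
import urllib.request

class KeepMethodRedirectHandler(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        if code in (307, 308):
            # Re-send the original request unchanged, just at the new URL.
            return urllib.request.Request(
                newurl,
                data=req.data,
                headers=req.headers,
                origin_req_host=req.origin_req_host,
                unverifiable=True,
                method=req.get_method())
        return urllib.request.HTTPRedirectHandler.redirect_request(
            self, req, fp, code, msg, headers, newurl)

    # 308 is handled exactly like 307 (same alias trick noted in the tests above).
    http_error_308 = urllib.request.HTTPRedirectHandler.http_error_307
```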
* allow shells to have per host options, remote_tmp
added language to shell
removed module lang setting from general as plugins have it now
use get to avoid bad powershell plugin
more resilient tmp discovery, fall back to `pwd`
add shell to docs
fixed options for when frags are only options
added shell set ops in t_e and fixed option frags
normalize tmp dir usage
- pass tmpdir/tmp/temp options as env var to commands, making it default for tempfile
- adjusted ansiballz tmpdir
- default local tempfile usage to the configured local tmp
- set env temp in action
add options to powershell
shift temporary to internal envvar/params
ensure tempdir is set if we pass var
ensure basic and url use expected tempdir
ensure localhost uses local tmp
give /var/tmp priority, less perms issues
more consistent tempfile mgmt for ansiballz
made async_dir configurable
better action handling, allow for finally rm tmp
fixed tmp issue and no more tempdir in ballz
hostvarize world readable and admin users
always set shell tempdir
added comment to discourage use of exception/flow control
* Mostly revert expand_user as it's not quite working.
This was an additional feature anyhow.
Kept the use of pwd as a fallback but moved it to a second ssh
connection. This is not optimal but getting that to work in a single
ssh connection was part of the problem holding this up.
(cherry picked from commit 395b714120522f15e4c90a346f5e8e8d79213aca)
* fixed script and other action plugins
ensure tmpdir deletion
allow for connections that don't support new options (legacy, 3rd party)
fixed tests
We do not go through the effort of finding the right PROTOCOL setting if
we have SSLContext in the stdlib. So we do not want to hit the code
that uses PROTOCOL to set the urllib3-provided ssl context when
SSLContext is available. Also, the urllib3 implementation appears to
have a bug in some recent versions. Preferring the stdlib version will
work around that for those with Python-2.7.9+ as well.
Fixes #26235, fixes #25402, fixes #31998
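A hedged sketch of the preference order described above; `HAS_SSLCONTEXT` mirrors the flag used in module_utils/urls.py, the rest is illustrative.

```python
# Sketch: if the stdlib has SSLContext (Python 2.7.9+/3.x), build the context
# there and skip the PROTOCOL selection needed for the urllib3/pyOpenSSL path.
import ssl

HAS_SSLCONTEXT = hasattr(ssl, 'SSLContext')

def make_context(cafile=None):
    if HAS_SSLCONTEXT:
        # Stdlib path: sane defaults, hostname checking, CA file/system trust.
        return ssl.create_default_context(cafile=cafile)
    # The pyOpenSSL-via-urllib3 fallback would go here and is the only place
    # where a PROTOCOL_* constant has to be chosen explicitly.
    raise NotImplementedError('pyOpenSSL fallback omitted in this sketch')
```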
* module_utils.urls - Encode the proxy connect as binary
Under Python3 the sendall method expects binary not a string.
Prior to this change the below exception was being thrown;
Traceback (most recent call last):
File "/tmp/ansible_umxox7_x/ansible_modlib.zip/ansible/module_utils/urls.py", line 1044, in fetch_url
client_key=client_key, cookies=cookies)
File "/tmp/ansible_umxox7_x/ansible_modlib.zip/ansible/module_utils/urls.py", line 951, in open_url
r = urllib_request.urlopen(*urlopen_args)
File "/opt/blue-python/3.6/lib/python3.6/urllib/request.py", line 223, in urlopen
return opener.open(url, data, timeout)
File "/opt/blue-python/3.6/lib/python3.6/urllib/request.py", line 524, in open
req = meth(req)
File "/tmp/ansible_umxox7_x/ansible_modlib.zip/ansible/module_utils/urls.py", line 729, in http_request
s.sendall((self.CONNECT_COMMAND % (self.hostname, self.port)).decode())
AttributeError: 'str' object has no attribute 'decode'
Encoding the value is in line with the lines below it (Proxy-Authorization etc.), which are already sent as binary.
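A hedged sketch of the fix; CONNECT_COMMAND, hostname and port mirror the names in the traceback above, and to_bytes is the module_utils text helper.

```python
# Sketch: under Python 3 the CONNECT line must be sent as bytes, so encode
# it instead of calling .decode() on a str.
from ansible.module_utils._text import to_bytes

def send_connect(sock, connect_command, hostname, port):
    # e.g. connect_command = 'CONNECT %s:%s HTTP/1.0\r\n'
    sock.sendall(to_bytes(connect_command % (hostname, port),
                          errors='surrogate_or_strict'))
```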
This patch adds cookie parsing to the fetch_url/open_url module_utils
method. The overall result will still contain the key `set_cookie`; however,
an additional key (`cookies`) will also be present. This new field is a
dictionary of values. Overall, this should make looking for individual
cookies in the response much easier, as currently the `set_cookie` field
is an amalgamation of the returned set-cookie headers and can be somewhat
difficult to parse.
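A short hedged example of consuming the new field from a module; the cookie name is illustrative.

```python
# Sketch (inside an Ansible module): read individual cookies from the new
# 'cookies' dict instead of parsing the combined 'set_cookie' string.
from ansible.module_utils.urls import fetch_url

def get_session_cookie(module, url):
    resp, info = fetch_url(module, url)
    # 'set_cookie' (the raw, concatenated headers) is still present;
    # 'cookies' is the parsed name -> value mapping added by this change.
    return info.get('cookies', {}).get('sessionid')
```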
The HTTP User-Agent "ansible-httpget" is already effectively the default,
since it is the default value provided by the `url_argument_spec` helper
method. Yet it may not be practical for all modules to get their
argument_spec that way.
Without a default User-Agent we fall back on the library's User-Agent,
which is something like "Python-urllib/2.7".
While I'm no big fan of web servers making decisions based on the
provided User-Agent I still think that part of being a well-behaved
HTTP client is to provide an informative User-Agent. Not to mention
that it's a good thing for Ansible to behave consistently.
Indirectly fixes #26239
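A hedged sketch of the effect from a caller's perspective; the URLs are illustrative, and the exact fallback wiring inside open_url is assumed rather than quoted.

```python
# Sketch: with this change, an unset http_agent is reported consistently as
# "ansible-httpget" rather than the bare library default like "Python-urllib/2.7".
from ansible.module_utils.urls import open_url

resp = open_url('https://example.com/api')                              # Ansible default agent
resp = open_url('https://example.com/api', http_agent='my-module/1.0')  # explicit override
```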
socket.create_connection is a higher-level function, which tries to
establish a socket connection using both AF_INET and AF_INET6. It got
introduced in Python 2.6, which ought to be fine with Ansible 2.4.
Fixes #26740
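A hedged sketch of the substitution; the proxy host/port names and wrapper are illustrative, the socket calls are stdlib.

```python
# Sketch: socket.create_connection() resolves the host itself and tries both
# IPv4 and IPv6 addresses, replacing a manual AF_INET-only socket() + connect().
import socket

def open_proxy_socket(proxy_host, proxy_port, timeout=10):
    # Old style (IPv4 only):
    #   s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    #   s.connect((proxy_host, proxy_port))
    return socket.create_connection((proxy_host, proxy_port), timeout=timeout)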
Based on issue 23642, add some info about the Python executable and
version in use to the error message when an SSL connection fails in a
way that may be related to the version.
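A hedged sketch of enriching the SSL validation error; the wording and helper are illustrative, the platform/sys calls are stdlib.

```python
# Sketch: include the interpreter path and version in the SSL failure message,
# since old Pythons without SNI/modern TLS are a common root cause.
import platform
import sys

def ssl_error_hint(base_msg):
    return ('%s (python %s from %s). '
            'This may be caused by an outdated Python/OpenSSL combination.'
            % (base_msg, platform.python_version(), sys.executable))
```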
* Build HTTPSClientAuthHandler more similarly to how HTTPSHandler works
* Add docs for new client cert authentication
* Support older versions of python
* Simplify logic
* Initial support for client certs in urls.py
* Add an extra test
* Add a get_url test for client cert auth
* Add additional test for client cert auth, with validation and ssl mismatch
* Skip assert when http tester not available
* Update version_added for new options
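A hedged usage example of the new client certificate options; the paths and URL are illustrative.

```python
# Sketch: authenticate to an HTTPS endpoint with a client certificate.
from ansible.module_utils.urls import open_url

resp = open_url('https://secure.example.com/api',
                client_cert='/etc/pki/tls/certs/client.pem',   # cert (may also contain the key)
                client_key='/etc/pki/tls/private/client.key',  # separate key file, if any
                validate_certs=True)
```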