I'm trying to fetch a page at a nonexistent (unresolvable) hostname using wget. I expect it to fail, but it does not.
Here is a transcript:
[mark@cn ~]$ cat /etc/resolv.conf
; google public
nameserver 8.8.8.8
nameserver 8.8.4.4
[mark@cn ~]$ host nonexistent.example.com
Host nonexistent.example.com not found: 3(NXDOMAIN)
[mark@cn ~]$ wget -O - http://nonexistent.example.com/
--2010-09-05 22:12:09-- http://nonexistent.example.com/
Resolving nonexistent.example.com... 205.178.189.131
Connecting to nonexistent.example.com|205.178.189.131|:80... connected.
HTTP request sent, awaiting response... 301 Moved Permanently
Location: http://127.0.0.1 [following]
--2010-09-05 22:12:09-- http://127.0.0.1/
Connecting to 127.0.0.1:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 524 [text/html]
Saving to: `STDOUT'
0% [ ] 0 --.-K/s
(some HTML that my local Apache serves)
100%[======================================>] 524 --.-K/s in 0s
2010-09-05 22:12:09 (62.5 MB/s) - `-' saved [524/524]
Why is this happening? Any ideas?
OS: CentOS 5.5 x86_64
Network: cloudnext dedicated virtual servers
I'm asking because I've tried the same thing in Python and something similar happens there. Something fishy is going on and I can't quite figure out what.
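For reference, here is a minimal sketch of the Python reproduction (stdlib only). Both wget and Python resolve names through the system resolver, so if the resolver itself is misbehaving, both tools should show the same symptom:

```python
import socket

# nonexistent.example.com should not resolve; on the affected host
# it does, because the system resolver (not wget or Python) is
# appending a search domain behind the scenes.
try:
    addr = socket.gethostbyname("nonexistent.example.com")
    print("unexpectedly resolved to", addr)
except socket.gaierror as exc:
    print("resolution failed as expected:", exc)
```

On a correctly configured host this prints the failure branch; on the affected host it prints an address, matching the wget transcript above.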
Answer
Did you maybe leave out the part of your resolv.conf where your search domain(s) are listed? If at least one of your search domains (or the domain of your server's FQDN) has a wildcard DNS entry, then what wget really resolves is nonexistent.example.com.your.domain.com. That name probably points to a web server configured to redirect clients to localhost when it receives a request for an unknown vhost.
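The resolver's search-list behaviour can be sketched as a pure function: given a name, the configured search domains, and the ndots option, it produces the sequence of names actually queried. This is a simplified illustration (real glibc stops at the first name that resolves), and your.domain.com stands in for whatever search domain your provider configured:

```python
def candidate_names(name, search_domains, ndots=1):
    """Simplified sketch of glibc's search-list logic:
    a name containing at least `ndots` dots is tried as-is first,
    then with each search domain appended; a name with fewer dots
    is tried with the search domains first. A trailing dot makes
    the name absolute and skips the search list entirely."""
    if name.endswith("."):
        return [name]
    candidates = []
    if name.count(".") >= ndots:
        candidates.append(name)
    candidates += ["%s.%s" % (name, d) for d in search_domains]
    if name.count(".") < ndots:
        candidates.append(name)
    return candidates

# "your.domain.com" is a hypothetical search domain for illustration:
print(candidate_names("nonexistent.example.com", ["your.domain.com"]))
# → ['nonexistent.example.com', 'nonexistent.example.com.your.domain.com']
```

The first candidate gets NXDOMAIN, but if your.domain.com has a wildcard A record, the second candidate resolves, and that is the address wget connects to.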
The proper way to fix this, in my opinion, is not to use wildcard domains, or at least not to use them as search domains. If in fact your server's FQDN is in a wildcard domain, you can work around the problem by putting this in your resolv.conf:
options ndots:1
search .