Introducing Testing Domain - localtest.me

Save this URL, memorize it, write it on a sticky note, tweet it, tell your colleagues about it! 

localtest.me (http://localtest.me)

and

*.localtest.me (http://something.localtest.me)

If you do any testing on your local system you’ve probably created hosts file entries (c:\windows\system32\drivers\etc\hosts) for different testing domains and had them point back to 127.0.0.1.  This works great but it requires just a bit of extra effort.
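For comparison, the manual approach looks something like this (the site names here are purely illustrative):

```
# c:\windows\system32\drivers\etc\hosts  (or /etc/hosts on Linux/macOS)
127.0.0.1    site1.local
127.0.0.1    site2.local
127.0.0.1    api.myproject.local
```

Every new test site means another line here, and wildcards aren't supported.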

This localtest.me trick is so obvious, so simple, and yet so powerful.  I wouldn’t be surprised if there are other domain names like this out there, but I haven’t run across them yet so I just ordered the domain name localtest.me which I’ll keep available for the internet community to use.

Here’s how it works. The entire domain name localtest.me, along with every wildcard subdomain under it, points to 127.0.0.1.  So without any changes to your hosts file you can immediately start testing with a local URL.

Examples:

http://localtest.me
http://newyork.localtest.me
http://mysite.localtest.me
http://redirecttest.localtest.me
http://sub1.sub2.sub3.localtest.me

You name it, just use any *.localtest.me URL that you dream up and it will work for testing on your local system.

This was inspired by a trick that Imar Spaanjaars introduced me to. He created a loopback wildcard URL with his company domain name.  I took this one step further and ordered a domain name just for this purpose.

I would have liked to order localhost.com or localhost.me, but those domain names were taken. So to help you remember: it’s ‘localtest’, not ‘localhost’, and it’s ‘.me’, not ‘.com’.

I can’t track usage since the domain name resolves to 127.0.0.1 and never passes through my servers, so this is just a public tool which I’ll give to the community. I hope it gets used. And, since I can’t really use the domain name to explain itself, please spread the word and tell others about it.

Some examples of how to use it:

  • Creating websites on your dev machine.  site1.localtest.me, site2.localtest.me, site3.localtest.me.
  • Great for URL Rewrite (IIS) or mod_rewrite (Apache) testing: redirect.localtest.me, failuretest.localtest.me, subdomain.localtest.me, city1.localtest.me.
  • Any testing on your local system where a friendly URL would be useful.
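As a sketch of the mod_rewrite case, a rule you might exercise with these names could look like the following (the host names and the 302 target are only examples):

```
# .htaccess sketch: send requests for redirect.localtest.me to another test subdomain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^redirect\.localtest\.me$ [NC]
RewriteRule ^(.*)$ http://subdomain.localtest.me/$1 [R=302,L]
```

Because every subdomain already resolves locally, you can test host-based conditions like this without touching DNS or the hosts file.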

I hope you enjoy!

46 Comments

  • Here's two other options: lvh.me and vcap.me.

  • "I would have liked to order localhost.com or localhost.me ..."

    I'd never have guessed, given the typo in your first example URL! :o)

  • What's wrong with just "localhost"? Doesn't that do the same thing by default?

  • @Scott. Thanks. I figured there had to be others out there.

    @RichardD. :) Good one. Thanks for the catch, I just corrected that.

  • @mxmissile. That works great for a single site, but when you get into more sites, or into URL Rewrite testing that involves redirects or needs to mimic other domain name patterns, then it's helpful to have an unlimited range of 3rd-level and deeper domain names at your disposal.

  • Nice one, cheers. I think it may be a little difficult to get an SSL certificate for this domain ;-)

  • @Grahame. Good question. Since this would need to be installed on your system it will take some extra work for each person using it. You can use a self-signed cert, but are you thinking of a certificate that could be downloaded and installed so that you have full SSL support without a warning?

  • I think it poses a great security risk to point to a 3rd party domain: there's no guarantee the web traffic sent there is not captured.

    I'm not sure what the problem with using 127.0.0.1 instead of this potentially risky approach is.

  • @Tom. You're safe for a couple reasons. If you ping it first and it goes to 127.0.0.1 then traffic never leaves your computer. It's not possible to track the traffic outside of your computer.

    Additionally, let's say that I (or anyone else doing something like this) wanted to proxy the traffic for tracking purposes. It wouldn't be possible to call back to '127.0.0.1' from my servers (since my servers have no access to your local web server), making it impossible to proxy the traffic without sending you to a different page. As long as you test using the URL and it shows a webpage from your computer then you can be confident that it went directly there without going anywhere in-between.

    In other words, I trust the lvh.me and vcap.me mentioned above (which I know nothing about the person setting it up) just as much as the domain that I setup since a ping returns 127.0.0.1 and I'm confident that traffic never leaves my computer.

    In regards to just using 127.0.0.1, this isn't for everyone. If you have a single site that you test with on your local computer then using 127.0.0.1 or localhost directly is perfectly acceptable. Localtest.me is for situations where you manage multiple sites or you test more complex domain patterns.

  • Some comments were mentioned on Twitter which I will reply to here. ARP poisoning isn't possible since traffic never leaves the local computer. DNS poisoning isn't any more possible than it would be for a domain like google.com. If your network's DNS has been poisoned then you have more things to worry about than your test site.

    Someone squatting a lapsed expiry does mean that a malicious person can gain access to the domain name, but again, if you visit mysite1.localtest.me and it shows your website then you can rest assured that traffic never left your computer. See my comment above about the proxy issue, making it impossible to have traffic go to the public internet and back to your private network again. It may be possible to create a redirect solution (not using 127.0.0.1 for the DNS record of course) but it couldn't pass POST data around without your browser warning you.

    This is a DNS solution, so as long as DNS points to 127.0.0.1 then at the web layer traffic will never leave the local computer.

    The only concern I would have with using something like this from someone I didn't trust is if they set the DNS record to 127.0.0.1 long enough that I trusted them, and then they changed the DNS to something else. But in that situation it would become immediately obvious since it shouldn't show my testing website anymore.

    No one has to use this. If you work for NASA and you want to take every precaution, then you can continue to use the hosts file or your own DNS zone. This is just a quick handy way to do testing on your local system without needing to mess with your hosts file.

  • This also comes in quite handy when you have a web project that accesses a WCF or other service-based site on the same machine. Rather than messing with config settings, port numbers, or virtual directories, you can simply use:

    http://website.localtest.me
    http://services.localtest.me

    And have the website reference the services project. When you're ready to go live, just change the service domain name in the config file (or use Web Config Transformations) and you're ready to go.

    Thanks, Scott, for making this way more public than I did. And BTW: it was Nicolai who initially taught me this trick.

    Cheers,

    Imar

  • If you want to avoid traffic leaving your computer, unplug your computer from the internet.

  • @Imar. Good examples. I'll need to tell Nicolai thanks too for the idea.

    @Anon. That would do the trick. :)

  • I don't know yet what I'll use it for, but for those who have concerns about security...

    On a Windows box, go to the command prompt and enter

    tracert localtest.me

    You can see that the traffic never goes anywhere outside your computer. (The only thing that goes out is the DNS resolution, but that is completely different.)

    Scott - Good one!

  • Great idea,
    Thank you!

  • I don't trust you, so I'll never use your domain. That said, I have used the same approach with domains I control, and it works well, so this should really be considered a live example of what's possible, to be used only when you trust the domain owner.

    Here's a situation you haven't thought of, and why I wouldn't advocate that anyone use your domain or another such public untrusted domain. Someone may build their site to use persistent cookies. If they sign in to their dev site, and you then change the domain to point to your servers, the next time they go to 'their' site, their persistent cookies will be sent to your servers. Without ever knowing exactly how securely they've implemented cookies and what information they store in them, there's a chance they can leak sensitive data to you.

  • The hosts file will still work better for SSL/TLS.
    I just borrow the real certificate from a test box and override the test server IP in my hosts file.
    SSL with a real certificate vs. self-signed certificate!

    On Windows (and probably other systems), you can create loopback network interfaces nailed to a specific IP. This is good for supporting multiple, simultaneous SSL connections on different apparent hosts.

    Again add a hosts entry for these loopback IPs.

    The DNS approach will definitely work better in a collaborative environment. It IS a pain to configure hosts files. "It works on my dev box..."

  • @Peter, good idea. Nice way to show where the traffic is going.

    @Alan. Yes, you're right that that is a possible way to exploit it. It's unlikely but possible. Unlikely because someone owning a domain like this (me in this case, but as the comments have shown, there are others too) would need to build up trust using the domain, switch it to their own servers, wait for you to test your site again (that could be days, weeks, etc.), and then you would immediately realize that it's the wrong site. So I would have someone's dev cookies, but because it's obvious, the domain would immediately lose credibility and not be used again.

    But there is that possibility so if you don't know and trust the source then you may want to continue to use hosts files, create a wildcard entry for a sub-domain of a domain you already own, or buy your own domain name.

    In this case I'm doing it to help the community and not for any monetary gain, but you don't know me, so I can understand your caution.

  • Because I appreciate both cleverness and magnanimity, I commend you for your effort.

    I already have a different way to deal with this, so I am unlikely to use it, but I still feel that it is the thought that counts, and your thought earns my thought.

    Cheers.

  • This method also doesn't deal with cases where you want to test multiple IP addresses, e.g., if you have different hosts set up on different virtual machines on your computer. Or multiple certificates. Or a bunch of other scenarios. Plus of course, if your work depends on the name "localtest.me," it can't be viewed on any other computer.

    It is nice to have multiple domain levels compared to localhost, especially since "localhost" fails most domain regexes. But if you're doing something more complicated than localhost works for, it seems like modifying /etc/hosts is a minuscule task.

  • A nice gift to all of us developers! I have had a similar setup internally, on my local dns server, that did this, but now it will work everywhere!

    Thank you!
    -bill

  • @RJ. I agree, it won't work for all situations. It's helpful for the situations where it works, but beyond that you'll need your own DNS or hosts file entries. Some people depend on their example domain name for URL Rewrite testing so that the rules work in production too; they may rewire the domain's IP from the hosts file, or add a loopback adapter to their machine so that they can test an example domain name. It won't work for that either.

  • @Englebart Yes, that makes sense. For SSL, if you don't want to use a self-signed cert with a warning then you'll need to use your own domain name which you have a valid cert for. (I'm partially tempted to buy a cert and make it exportable and downloadable, but if people will go through that much effort they may want to use their own domain name at that point)
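As a sketch of the self-signed route (the file names are arbitrary, and modern browsers also expect a subjectAltName, added here via the `-addext` flag available in OpenSSL 1.1.1 and later):

```shell
# Generate a throwaway self-signed wildcard certificate for local HTTPS testing.
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout localtest.key -out localtest.crt \
  -subj "/CN=*.localtest.me" \
  -addext "subjectAltName=DNS:localtest.me,DNS:*.localtest.me"
```

Browsers will still warn until you explicitly trust the certificate on each machine, which is exactly the extra work mentioned above.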

  • This is a great idea. Thank you!

  • @joequincy. Thanks for mentioning. We added the AAAA record last night, but it looks like that will add confusion, so I just removed it again and added ipv6.localtest.me and *.ipv6.localtest.me. That will allow specific testing.

    Also, Tatham Oddie, Brendan Forster and Aaron Powell teamed up with me on this and have supplied a wildcard cert and a landing page. Very cool! So everyone can install the cert and do local testing with a valid HTTPS cert now too.

    There is one CNAME reserved for the landing page and cert download: http://readme.localtest.me/.

  • Thank you very much Scott, this is great!

  • Great idea,

    Thank you!

  • One more thing that's important to realize: although the name resolves to a local address, you still need Internet access to resolve the address against the DNS server. I ran into this when demoing some stuff for a client without WIFI access. If you're off-line, you would still need to add the host records to your hosts file....

    Imar

  • @Imar. Yeah, that could throw off a good demo. Sounds like a good thing to keep in mind.

  • I'm getting a certificate revoked warning for the supplied SSL certificate. Am I missing something or is the cert not valid any more?

  • Hi Marnix,

    I'm getting the same. It appears that the revocation list from the EssentialSSL CA or COMODO is having problems, or at least that's how it appears to me so far. We'll look into it and report back.

    Scott

  • Hi Marnix,

    Just a quick update that I haven't forgotten about this. Are you using Win 8? I am and there was an update this past weekend for the client revocation list so I'm wondering if it has to do with that. I'm still waiting back on someone responsible for the cert.

  • Hi Scott,

    I'm on Windows 7 and I'm still seeing this issue in all browsers.

  • Hi Marnix,

    Thanks for mentioning that. I'll put another burst of energy in getting this resolved. It got stuck in a thread with the certificate authority vendor but we really need this to be resolved for it to be useful to anyone.

    Scott

  • @Marnix,

    Ok, I finally found out what happened. Too bad they didn't communicate it to me earlier. The agreement says that we can't publish the private key on the public internet. So basically they say that they can't offer the certificate to us. So we'll talk to them and possibly other cert authorities to find one that will confirm that they can support this unique situation.

    Scott

  • @Marnix,

    Just a quick update on this. It doesn't look promising that we'll get a solution for this, which is a real bummer. With the private key being handed out on the internet, the cert authorities complain that it puts the cert at risk, along with their reputation. There's still a thread going in the background but it's moving slowly so at this point it doesn't look likely that we'll get a workable solution.

  • Any luck since the ssl cert issues arose?

  • Hi drewid,

    No, we have to officially say that it's not possible now. Since the cert is handed around publicly on the internet it isn't considered a safe cert and it can potentially be abused for users in countries that may override DNS like Iran, North Korea, and China. So as much as we would love to make this available, we cannot. For local testing with a cert you'll need to use a self-signed cert, a domain certificate, or purchase your own domain name and cert for your own usage. Sorry about the unfulfilled teaser!

    Of course the domain name still works for localhost testing. I use it daily myself.

  • I completely understand how this works (unlike a few others who commented) and I think this is a great idea!
    Thank you for providing this service!
    Now if I could figure out a way to make apache auto route vhosts to folders based on the subdomains.... Hum...

    Again Thanks!

  • Thanks Don. The domain is getting about 20K requests per month at the moment, so people are definitely using it. The DNS provider reports a query count.

    As for the vhosts-to-folders question, I could answer that for IIS, but sorry to say I'm no help for how to do that for Apache.

  • For Apache I added this:

    <VirtualHost *>
        VirtualDocumentRoot /Applications/MAMP/htdocs/%1/
        ServerName localtest.me
        ServerAlias *.me
        UseCanonicalName Off
        <Directory /Applications/MAMP/htdocs>
            Options Indexes FollowSymLinks MultiViews
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>
    </VirtualHost>

    under the normal setup:

    NameVirtualHost *

    <VirtualHost *>
        DocumentRoot /Applications/MAMP/htdocs
    </VirtualHost>

    (add here)

    This will accept, for example, "something.localtest.me" and automatically look in the /Applications/MAMP/htdocs/something/ folder.

  • This simply doesn't work.
    I just pinged localtest.me and it didn't resolve. Am I missing something here?

  • @Don, thanks for posting. This should come in handy for the Apache folk!

  • @jaffa. I can't imagine why it wouldn't work. If you ping google.com, does that resolve right now? The best I can guess is that you had a temporary issue with your DNS server. nslookup will give more details on DNS-related issues; a simple place to start is to enter "nslookup localtest.me" at the command line and see what you get.

  • This is a great idea.

    I read through most of the comments and seriously people, give the man some credit.

    I'm using it because Windows doesn't support wildcard entries in hosts files.

    I have a dev site that works like customer.localtest.me, so setting up lots of customer entries in the hosts file is a pain.

    I did, however, add localtest.me and www.localtest.me to my hosts file; this lets me control the main entry points and work offline:

    127.0.0.1 localtest.me www.localtest.me

    People could also override readme with this method if they wish.

  • Thanks! This is quite useful if you need to test OAuth-based authentication workflows; most major social networks (notably Facebook and Google) won't accept localhost as a domain.

Comments have been disabled for this content.