Helping search engines: WWW to non-WWW redirect with Nginx and URL optimization
Optimizing URLs
A trailing slash makes life much easier for search engines and your own webserver.
First of all, APPEND_SLASH should not be False in your settings.py. It is True by default, so if you have set it to False, undo that.
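For reference, it is a single line, and since True is the default you normally do not need to write it at all. Roughly, it looks like this:
# settings.py
APPEND_SLASH = True  # default; with CommonMiddleware enabled, /pets gets redirected to /pets/ when only /pets/ matches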
Secondly, use trailing slashes in your urls.py files and in robots.txt as well. For serving robots.txt, I find an HTML template a better solution than writing a view, so my favorite way to add a robots.txt to a Django project is this:
1. templates/general/robotstxt.html
User-agent: *
Disallow: /admin/
Host: https://lovelywebsite.com
Sitemap: https://lovelywebsite.com/sitemap.xml
Clean-param: ref /home/
Clean-param: ref /home/pets/
Clean-param: ref /home/pets/my_love/
Clean-param: ref /home/pets/*
2. urls.py
from django.urls import path
from django.views.generic import TemplateView

urlpatterns = [
    path('robots.txt', TemplateView.as_view(template_name='general/robotstxt.html', content_type='text/plain'), name='robots'),
]
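A quick way to sanity-check it is to request the file from the development server (127.0.0.1:8000 is runserver's default address):
python manage.py runserver
curl http://127.0.0.1:8000/robots.txt
You should get the template back as plain text.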
Third, order your URL patterns so that essential pages (listings with SEO text, etc.) are easy to reach. For example, this is not effective:
urlpatterns = [
    path('pets/<slug:slug>/', core_views.DogView.as_view(), name='dog'),
    path('pets/my_love/', core_views.MyLoveView.as_view(), name='my-love'),
]
Because Django checks the patterns in order, a request for /pets/my_love/ matches the first pattern and is handled by DogView, and MyLoveView is never reached. Put the more specific pattern first, as shown below.
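Swapping the two lines fixes it, because the specific path is matched before the catch-all slug pattern:
urlpatterns = [
    path('pets/my_love/', core_views.MyLoveView.as_view(), name='my-love'),
    path('pets/<slug:slug>/', core_views.DogView.as_view(), name='dog'),
]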
WWW to non-WWW redirect with Nginx over HTTPS
This tutorial worked like a charm, so I'd like to share the piece that solved my problem in a second. Add this to your .conf file in /etc/nginx/sites-enabled/:
server {
    listen 80;
    access_log off;
    error_log off;
    server_name www.lovelywebsite.com;
    # keep the original path and query string when redirecting
    return 301 https://lovelywebsite.com$request_uri;
}
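Note that this block only catches plain-HTTP requests to the www host. If your certificate also covers www.lovelywebsite.com, a similar block for port 443 does the same for HTTPS (the certificate paths below are just placeholders, point them at your own files):
server {
    listen 443 ssl;
    server_name www.lovelywebsite.com;
    ssl_certificate /etc/letsencrypt/live/lovelywebsite.com/fullchain.pem;      # placeholder path
    ssl_certificate_key /etc/letsencrypt/live/lovelywebsite.com/privkey.pem;    # placeholder path
    return 301 https://lovelywebsite.com$request_uri;
}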
Then reload:
sudo systemctl reload nginx
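If you want to be on the safe side, run sudo nginx -t first; it validates the configuration without touching the running server.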