Why do some websites have language-specific URLs when their users still use the .com version?

I’m creating my first multilingual website and I have been looking at examples from other websites. So far I’ve noticed that many popular websites have a .com version and a kr.example.com or example.co.kr version. I know for a fact that people in that country (in my example it’s Korea) still use the regular .com version.

What’s the point of having a language-specific URL if the website automatically detects the language and offers a language selection if it detects it incorrectly?

Nginx Clean URLs don’t work

I tried this: How do I enable clean URLs with Nginx?

And I tried this: https://www.digitalocean.com/community/questions/how-do-i-enable-clean-urls-with-nginx-for-drupal

So, I tried these:

location / {
    index index.php;
    # This is cool because no php is touched for static content
    try_files $uri $uri/ @rewrite;
    expires max;
}

location @rewrite {
    # Some modules enforce no slash (/) at the end of the URL
    # Else this rewrite block wouldn't be needed (GlobalRedirect)
    rewrite ^/(.*)$ /index.php?q=$1;
}

location ~ \.php$ {
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME /srv/www/www.example.com/public_html$fastcgi_script_name;
    fastcgi_intercept_errors on;
    fastcgi_pass unix:/var/run/php-fpm.sock; # fastcgi_pass unix:/tmp/php5-fpm.sock;
}

And I tried these:

server {
    listen       80;
    server_name  example.org;

    location / {
        root   /path/to/drupal;
        index  index.php;
        error_page 404 = @drupal;
    }

    location @drupal {
        rewrite ^(.*)$ /index.php?q=$1 last;
    }
}

And clean URLs don’t work. The site runs at an IP address in a subdirectory: 127.0.0.1/dev

It has its own config at /etc/nginx/sites-available/dev, symlinked to /etc/nginx/sites-enabled/dev.

This is my current sites-available/default config:

root /var/www;

# Add index.php to the list if you are using PHP
index index.html index.htm index.php;

server_name _;

location / {
    # First attempt to serve request as file, then
    # as directory, then fall back to displaying a 404.
    # try_files $uri $uri/ =404;
    try_files $uri $uri/ @rewrite;
    expires max;
}

location @rewrite {
    # Some modules enforce no slash (/) at the end of the URL
    # Else this rewrite block wouldn't be needed (GlobalRedirect)
    rewrite ^/(.*)$ /index.php?q=$1;
}

# pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
#
location ~ \.php$ {
    include snippets/fastcgi-php.conf;

    # With php7.0-cgi alone:
    # fastcgi_pass 127.0.0.1:9000;
    # With php7.0-fpm:
    fastcgi_pass unix:/run/php/php7.0-fpm.sock;
}

# deny access to .htaccess files, if Apache's document root
# concurs with nginx's one
#
location ~ /\.ht {
    deny all;
}
}

This is my current sites-available/dev config:

server {
        server_name dev;
        root /var/www/dev; ## <-- Your only path reference.

        # Enable compression, this will help if you have for instance advagg module
        # by serving Gzip versions of the files.
        gzip_static on;

        location = /favicon.ico {
                log_not_found off;
                access_log off;
        }

        location = /robots.txt {
                allow all;
                log_not_found off;
                access_log off;
        }

        # This matters if you use drush prior to 5.x
        # After 5.x backups are stored outside the Drupal install.
        #location = /backup {
        #        deny all;
        #}

        # Very rarely should these ever be accessed outside of your lan
        location ~* \.(txt|log)$ {
                allow 192.168.0.0/16;
                deny all;
        }

        location ~ \..*/.*\.php$ {
                return 403;
        }

        # No no for private
        location ~ ^/sites/.*/private/ {
                return 403;
        }

        # Block access to "hidden" files and directories whose names begin with a
        # period. This includes directories used by version control systems such
        # as Subversion or Git to store control files.
        location ~ (^|/)\. {
                return 403;
        }

        location / {
                index index.php;
                # This is cool because no php is touched for static content
                try_files $uri $uri/ @rewrite;
                expires max;
        }

        location @rewrite {
                # Some modules enforce no slash (/) at the end of the URL
                # Else this rewrite block wouldn't be needed (GlobalRedirect)
                rewrite ^/(.*)$ /index.php?q=$1;
        }

        location ~ \.php$ {
                include fastcgi_params;
                fastcgi_split_path_info ^(.+\.php)(/.+)$;
                fastcgi_param SCRIPT_FILENAME $request_filename;
                fastcgi_intercept_errors on;
                fastcgi_pass unix:/run/php/php7.0-fpm.sock;
        }

        # Fighting with Styles? This little gem is amazing.
        # This is for D6
        #location ~ ^/sites/.*/files/imagecache/ {
        # This is for D7 and D8
        location ~ ^/sites/.*/files/styles/ {
                try_files $uri @rewrite;
        }

        location ~* \.(js|css|png|jpg|jpeg|gif|ico)$ {
                expires max;
                log_not_found off;
        }
}

I restart the nginx server by running sudo service nginx restart.
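One thing that stands out from the two configs: requests to 127.0.0.1/dev are most likely answered by the default server block (server_name _, root /var/www), not by the dev vhost (server_name dev), so the @rewrite there sends every path to /index.php at the web root rather than to /dev/index.php. Below is a minimal sketch of a subdirectory-aware block that could live in the default server; the /var/www/dev path and the php7.0-fpm socket are taken from the configs above, and the @drupal_dev name is made up. This is an illustration, not a tested fix.

# Sketch only: assumes the Drupal code base sits in /var/www/dev and that
# PHP-FPM listens on /run/php/php7.0-fpm.sock, as in the configs above.
location /dev {
    try_files $uri $uri/ @drupal_dev;
}

location @drupal_dev {
    # Hand the path (minus the /dev prefix) to Drupal's front controller as q=
    rewrite ^/dev/(.*)$ /dev/index.php?q=$1;
}

location ~ ^/dev/.+\.php$ {
    include fastcgi_params;
    # root is /var/www in the default server, so this resolves to /var/www/dev/...
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_intercept_errors on;
    fastcgi_pass unix:/run/php/php7.0-fpm.sock;
}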

nginx: Redirecting URLs to extensionless pages not working on the home page

I’m trying to automatically redirect pages with a PHP extension to an extensionless version, e.g. example.com/page.php to example.com/page. I’ve managed to get this working for all pages except my home page and a dynamic page; both return a 404 error page.

For the home page, if I try to visit example.com I get a 404 response, but if I visit example.com/index then the home page loads.

I obviously want the home page to load at example.com, and the dynamic page to load at example.com/check/something.com.

Here’s the relevant code:

server {
    index  index.php index.html index.htm;

    location / {
        try_files  $uri /$uri /index.php?$args;
        sendfile   off;

        # returns a 404 e.g example.com/check/something.com
        rewrite ^/check/(.*)$ /check.php?url=$1&name=&submitBtn=check;
    }

    # If PHP extension then 301 redirect to semantic URL
    location ~ ^(.*)\.php$ {
        return 301  $scheme://$server_name$1$is_args$args;
    }

    location ~ ^/(.*) {
        try_files $uri.php @static; # If static, serve it in @static

        fastcgi_pass unix:/var/run/php-fpm/php-fpm.sock;
        fastcgi_index index.php;
        fastcgi_param SCRIPT_FILENAME $document_root/$1.php;
        fastcgi_param SCRIPT_NAME $fastcgi_script_name;
        include fastcgi_params;
    }

    location @static {
        try_files $uri =404;
    }
}
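Looking at that block, the broad location ~ ^/(.*) regex matches / and /check/... before location / gets a chance, so the index handling and the /check rewrite there never run, which is one likely reason for the 404s. Below is a sketch of one common way this kind of extensionless setup is structured; it reuses the socket path and the check.php parameter names from the config above, but it is an illustration, not a verified fix for this site.

# Sketch only: reuses the php-fpm socket from the config above.
server {
    index index.php;

    location / {
        # Map /check/<something> onto check.php before anything else
        rewrite ^/check/(.*)$ /check.php?url=$1&name=&submitBtn=check last;
        # Serve real files/directories, otherwise try the same path with .php
        try_files $uri $uri/ @extensionless;
    }

    # /page.php requested from outside -> 301 to /page;
    # internal rewrites to *.php are served normally
    location ~ ^(.+)\.php$ {
        if ($request_uri ~ ^[^?]*\.php($|\?)) {
            return 301 $1$is_args$args;
        }
        try_files $uri =404;
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/var/run/php-fpm/php-fpm.sock;
    }

    location @extensionless {
        # /page -> /page.php; / is handled by the index directive above
        rewrite ^(.*)$ $1.php last;
    }
}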

Thanks 🙂

How to import data from multiple URLs using a Feeds importer?

I have two API URLs. One returns all the products with product IDs, and the second URL returns details about a single product given the product ID.

So my requirements are:

  1. Request the first URL and get the XML which contains all products with product IDs.

  2. Then loop over the list of products; for each product ID, request the second URL and get the XML data which contains the product details.

  3. And import all the products into the product content type.

Is this possible using the Feeds importer? I can’t figure out how to do this with it. Or do I have to write a custom fetcher or processor?

SEO URLs in custom module

I have a module built in Magento 2.1.16 which stores details of contacts in the database and fetches them on a list page and a details view page. The contacts are of 3 types:

  1. Branches (https://roadmaster.com.co/en_sa/contacts)
  2. Dealers (https://roadmaster.com.co/en_sa/dealers)
  3. Authorized Centers (https://roadmaster.com.co/en_sa/servicecenters)

When I click on “more information” on any of them, I get a URL like https://roadmaster.com.co/en_sa/dealers/view/index/id/24/, while I want something like https://roadmaster.com.co/en_sa/dealers/al-khaleej

or, for the URL https://roadmaster.com.co/en_sa/contacts/view/index/id/1/, I want something like https://roadmaster.com.co/en_sa/contacts/rm-jeddah

The contacts are stored in a single table and are distinguished by a contact_type attribute.

Redirecting old file URLs (no pattern)

We’re running Drupal 7.65.

Our previous website manager moved us from Drupal 6 to Drupal 7 and cleaned up the file structure on our server.

While all the files now on our website have clean and organised URLs, and all general URLs on our website are now clean, Google still has our old URLs indexed.

e.g.:

old: domain.com/program-name

new: domain.com/programname

and

old: domain.com/sites/default/files/documents/owl-final-final3.pdf

new: domain.com/sites/default/files/folder1/folder2/owlguide.pdf

So if someone googles ‘program name’ or googles for our ‘owl guide’, the Google result takes them to the old URLs (broken links) rather than the new URLs.

Is there a way I can redirect these broken links to e.g. a search page on our website, rather than the “404 page not found” page?

For some of the old URLs to our pages (e.g. /program-name rather than /programname) I’ve been able to set up URL aliases to redirect people, but I don’t know how to do the same for the files; there are so many that I wouldn’t be able to work them all out.

Sorry if I haven’t used the right words, I’m not a web developer! I looked into Pathologic but don’t think it addresses the problem we’re having…
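Since the old paths have no pattern, they end up having to be listed one-by-one somewhere. If nginx happens to sit in front of the site (an assumption; the question doesn’t say which web server is used), one way to express such a hand-maintained list is an nginx map, with anything else under the retired documents folder falling through to Drupal’s search page instead of a 404. A sketch using the two example URLs from above:

# Sketch only: assumes nginx fronts the Drupal site. The entries below are the
# two example URLs from the question; a real list would be maintained by hand.
map $request_uri $redirect_target {
    default                                               "";
    /program-name                                         /programname;
    /sites/default/files/documents/owl-final-final3.pdf   /sites/default/files/folder1/folder2/owlguide.pdf;
}

server {
    # ... the rest of the existing Drupal server block ...

    if ($redirect_target) {
        return 301 $redirect_target;
    }

    # Old document URLs that are not in the list go to the site search
    # (assumes Drupal's core Search module is enabled at search/node)
    location /sites/default/files/documents/ {
        try_files $uri /index.php?q=search/node;
    }
}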

Magento CE 2.3.1 – uptimerobot.com causes 404 errors on configured URLs

I have configured Homepage, Contact us, etc. to be monitored by uptimerobot.com.
But it randomly causes a 404 error, and clearing the cache fixes the issue (though it reappears later).
Note that the 404 only happens on the pages that are configured to be tracked by uptimerobot.com.

FYI, I am using

  • Magento CE 2.3.1
  • Redis
  • FPC (Filesystem)

I believe it has something to do with the request headers sent by uptimerobot in order to access those pages.
Has anyone faced such an issue with crawlers/bots?
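If nginx is in front of the store (an assumption), one low-risk way to test that theory is to log the monitor’s requests with their cache-relevant headers in a separate file, so they can be compared with a normal browser request the next time the 404 shows up. A sketch; the log path and format name are made up:

# Sketch only: assumes nginx in front of Magento. Logs UptimeRobot requests
# together with headers that commonly influence full-page-cache behaviour.
map $http_user_agent $is_uptimerobot {
    default            0;
    "~*UptimeRobot"    1;
}

log_format botdebug '$remote_addr "$request" $status '
                    'ua="$http_user_agent" accept="$http_accept" '
                    'enc="$http_accept_encoding" cookie="$http_cookie"';

server {
    # ... existing Magento vhost ...

    access_log /var/log/nginx/uptimerobot.log botdebug if=$is_uptimerobot;
}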

[1LINKLIST.COM] – ★MILLIONS of fresh urls, EVERY DAY★ – Guides, Tutorials, Auto-Sync, and More!


FAQ:

Contact Information:

Email: support@1linklist.com

Skype General Support: Sagarspatil
Skype Billing Technical/Support: Epic1Links

Refund Policy:

There isn’t one. Once you download your first list, no money back; sorry! As this is a service/digital product, be aware that PayPal customer protection does NOT cover you. (So no scammy hit-and-runs, sorry guys.)

 

Word Of Warning To Leakers:

Every. Single. List. Has a unique URL embedded in it (in random places). This URL is unique to your account. If we find a list floating around, it takes moments for us to identify the culprit and ban them.

As we fill up our member slots we will be adding additional security measures to ensure we have the most secure members area we can provide.

If you have any additional questions, don’t hesitate to ask!

Review Copies, Etc:

We have 20 beta-testers here on the GSA forums, and we’ve asked all of them to drop us a line! Keep an eye out and see what they have to say.