Proactively scan your website pages for mixed content issues using the desktop app.
Installation & Use
Other Platforms Available Here (Windows, Mac & Ubuntu Linux)
Scan Mode
Check for all issues, filter to just passive and active mixed content issues, or show Chrome's "Not Secure" warnings on insecure forms.
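For example, on a page served over HTTPS, an image requested over HTTP is passive mixed content, while a script requested over HTTP is active mixed content (which browsers block). The URLs below are illustrative:

    <!-- Passive mixed content: an insecure image on an HTTPS page -->
    <img src="http://www.example.com/images/logo.png" alt="Logo">

    <!-- Active mixed content: an insecure script -->
    <script src="http://www.example.com/js/app.js"></script>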
User Agent Switcher
Run as the declared HTTPS Checker spider, or identify as different browsers to test mobile versions of sites.
Issues Cap
Limit the number of issues found, making it easier to fix them as you go along and to export reports to PDF and CSV.
Use Robots.txt File Rules
Follow or ignore the rules set in a site's robots.txt file.
Additional Robots.txt-style Rules
Enter additional URLs or patterns you wish to ignore.
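As a sketch, a set of ignore rules in standard robots.txt Disallow syntax might look like the following; the paths and the wildcard pattern are hypothetical, and wildcard matching follows the common search-engine extension of the syntax:

    # Ignore search results and session-tagged URLs
    Disallow: /search
    Disallow: /*?sessionid=
    # Ignore print-friendly page versions
    Disallow: /print/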
Page Cap
Set a limit on the number of pages crawled. Use this with different starting URLs to crawl large sites in a more manageable way.
Logged In Session Crawling
Crawl pages that require a login to access, including admin areas. IMPORTANT: If pages have links which trigger actions (e.g. Add, Delete, Update), only run this on a test version of your site and ensure you have a proven backup to revert to.
Proxy Support
Supports HTTP, HTTPS, SOCKS, and proxy auto-config (PAC). Useful for testing development sites behind a proxy, or for seeing all the requests the app makes in order to crawl your site.
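If you haven't used PAC before, an auto-config file is simply a JavaScript function that tells the app which proxy to use for each request. A minimal sketch, assuming a hypothetical proxy at proxy.example.com:8080:

    // Minimal PAC file: route all requests through one proxy,
    // falling back to a direct connection if it is unreachable.
    function FindProxyForURL(url, host) {
        return "PROXY proxy.example.com:8080; DIRECT";
    }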
HTTPS Reporter
After completing your scans and fixing the issues, you could then use HTTPS Reporter to catch any remaining or new issues as they arise in real-time.
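HTTPS Reporter works with Content Security Policy (CSP) violation reports from visitors' browsers. As a sketch, a report-only header like the one below (the reporting endpoint is a placeholder) asks browsers to report any resource not loaded over HTTPS without blocking it:

    Content-Security-Policy-Report-Only: default-src https:; report-uri https://reporter.example.com/csp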
What Does The App Check?
The app scans your pages for passive and active mixed content issues, and can also flag insecure forms that trigger Chrome's "Not Secure" warning.
Will HTTPS Checker Work On My Machine?
The downloadable app works on 64-bit machines running a minimum of Mac OS X 10.8 or Windows 7. Most machines purchased in the last 3-5 years are 64-bit. If in doubt, simply try the free version.
Can I Trust This App Is Virus Free?
Yes. Wildfire Internet is an established UK business that has been trading since 2008, is enrolled in Apple's Developer Program, and uses authenticated DigiCert code signing certificates.
Can I Crawl ANY Website?
The app can normally crawl websites that adhere to good search engine spidering practices. Please note that some sites block spiders and therefore prevent the app from working. Tip: you can switch the User Agent to a desktop or mobile browser and try again. Robots.txt files can also be set to block access to certain parts of a website, but you can choose to ignore this in Advanced Settings to make sure you reach all pages.
How Many Pages Do You Crawl?
This depends on the licence you purchase: the free version starts at 500 pages, going up to 250,000 pages in the downloadable app. Since we might find multiple issues on a page, we put a cap on issues found. There is a default limit of 50,000 issues, at which point we stop the crawl so you can investigate and resolve the issues, then re-run the report afterwards. You can raise or lower this limit via Advanced Settings. If you have more than 250,000 pages, please contact us on 0845 643 1290 to discuss how we can meet your requirements.
Can I Crawl ALL Content On A Website?
There is no absolute guarantee that we can crawl everything, as some issues can prevent us accessing pages: timeouts, for example, or resources deeply embedded in code that changes content dynamically, such as heavy AJAX use. We recommend running HTTPS Reporter after HTTPS Checker to collect all violations, as this will mop up any areas the app hasn't been able to reach.
How Long Does A Crawl Take?
That depends on how many pages you have, the specification of your local machine, and your internet connection speed. In our tests the app crawls around 5,000 pages per hour. We display a count of the pages crawled, and you can stop the crawl at any time.
Can I Speed Up A Crawl?
Aside from running on faster hardware, there is an option to run only "Mixed Content Checks", which will help.
Can I Run The App On Certain Parts Of The Website & Not Others?
Yes, you can enter a URL as a starting point for the crawl, e.g. https://www.mydomain.com/blog. You can also use "Additional Robots.txt Style Rules" to disallow crawling of certain parts of the site using standard robots.txt syntax, as in the sketch below.
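For instance, to confine a crawl to the blog you might start at https://www.mydomain.com/blog and disallow the other sections; the paths below are hypothetical:

    # Keep the crawl out of sections you are not testing
    Disallow: /products/
    Disallow: /news/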
Can I Scan Pages You Have To Be Logged In To See, Including Admin Areas?
Yes, using "Logged In Session Crawling" and following the instructions. IMPORTANT: allowing a spider to crawl links in an admin page means it will perform whatever action is associated with each link, e.g. Change Password, Add User, Delete Account etc. Before you run this, ensure you have a proven backup of your site or run it on a separate test version, and enter the rules properly in the "Additional Robots.txt Style Rules" field to prevent data being deleted or modified (see the sketch below). NOTE: We are not liable for any damage caused by running this program.
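As a precaution, rules along these lines in the "Additional Robots.txt Style Rules" field would keep the spider away from links that trigger actions; the paths and query pattern are hypothetical and would need matching to your own site:

    # Block URLs whose links perform actions when followed
    Disallow: /admin/delete
    Disallow: /admin/add-user
    Disallow: /*?action=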
What's The Best Way To Use This On A Large Site?
The app has been proven to perform a continuous crawl of 250,000 pages on an individual site; the real constraint arises when a crawl throws up many tens or even hundreds of thousands of issues. These are stored in your local machine's memory (a physical limitation), and the report file could be so large you may not be able to open it, which is why there is an Advanced Setting to limit the number of issues found. Large sites, i.e. hundreds of thousands of pages, are typically split into sections such as /blog, /news, /products etc. We recommend running the app on each section in turn by entering a starting URL for each section, e.g. https://www.mydomain.com/blog, and setting realistic "Issues Cap" and "Page Cap" limits in Advanced Settings, as in the sketch below. The home page can then be run as a single page.
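A hypothetical crawl plan for a site of that size might look like this, with the caps chosen to suit your machine:

    Run 1: start at https://www.mydomain.com/blog      (Page Cap 50,000, Issues Cap 10,000)
    Run 2: start at https://www.mydomain.com/news      (Page Cap 50,000, Issues Cap 10,000)
    Run 3: start at https://www.mydomain.com/products  (Page Cap 50,000, Issues Cap 10,000)
    Run 4: start at https://www.mydomain.com/          (Page Cap 1, to check the home page alone)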
How Often Should I Run HTTPS Checker?
Initially, run it as part of your pre-migration process when moving from HTTP to HTTPS to establish what needs to be amended. Once you are on HTTPS, run it to make sure everything is fine. We recommend running it regularly as part of your web security administration processes, for example after adding new content, imports, or design changes, for peace of mind that everything remains good. Alternatively, use HTTPS Reporter to catch CSP violations in real-time.