Description
Connecting to common websites with the FUTURE crypto policy profile
How to test
It would be great if you could try connecting to sites that you normally connect to, but I understand if you have privacy concerns.
- For some optional steps, the sqlite package might be needed:
    dnf install -y sqlite
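  - If you are not sure whether the sqlite3 tool is already present, a quick check is enough (a sketch, assuming you run as root, like the dnf command above):
      command -v sqlite3 || dnf install -y sqlite   # install only if sqlite3 is missing; assumes root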
- Let's start with some well-known sites:
    echo google.com youtube.com facebook.com wikipedia.org yahoo.com amazon.com live.com vk.com twitter.com instagram.com reddit.com linkedin.com | tr " " "\n" >> sites.txt
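  - Optionally, append any other hosts you personally use to the same file (a sketch; example.org and example.net are placeholders):
      echo -e "example.org\nexample.net" >> sites.txt   # placeholders - replace with your own hosts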
- Firefox - get the sites you really visit
  - Export https sites from history:
      for f in $(find ~/.mozilla/firefox/ -name places.sqlite); do
          sqlite3 $f 'select distinct substr(replace(url, "https://", ""), 0, instr(replace(url, "https://", ""), "/")) from moz_places where url like "https://%";' >> sites.txt
      done
  - Alternatively/additionally, get https sites from bookmarks (Bookmarks -> Show All Bookmarks -> Import and Backup -> Export Bookmarks to HTML):
      cat bookmarks.html | grep -io 'href="https://[^ ]*' | cut -d\" -f2 | sed 's|https://\([^/]*\).*|\1|' >> sites.txt
- Chrome - get the sites you really visit
  - Export https sites from history:
      for f in $(find ~/.config/ -name History); do
          cp -f $f ./tmp.db && \
              sqlite3 tmp.db 'select distinct substr(replace(url, "https://", ""), 0, instr(replace(url, "https://", ""), "/")) from urls where url like "https://%";' >> sites.txt
          rm -f tmp.db
      done
  - Alternatively/additionally, get https sites from bookmarks (Bookmarks -> Bookmark manager -> Organize -> Export bookmarks to HTML file...):
      cat bookmarks.html | grep -io 'href="https://[^ ]*' | cut -d\" -f2 | sed 's|https://\([^/]*\).*|\1|' >> sites.txt
- Filter possible duplicates:
    cat sites.txt | sort | uniq > sites.new; mv -f sites.new sites.txt
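  - Alternatively, sort can deduplicate in one pass (an optional shortcut):
      sort -u sites.txt -o sites.txt   # sort and deduplicate the file in place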
- Try connecting to these sites with the FUTURE profile:
    update-crypto-policies --set FUTURE
    for site in $(cat sites.txt); do
        wget -q -O /dev/null https://$site || echo "FAIL wget $site"
        curl -s https://$site >/dev/null || echo "FAIL curl $site"
        (sleep 5; echo -e "GET / HTTP/1.1\n\n") | openssl s_client -connect ${site}:443 -servername $site &>/dev/null || echo "FAIL s_client $site"
    done
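- Optionally, switch the policy back when you are done testing. A sketch, assuming the machine was previously on the DEFAULT policy (check with --show first if unsure):
    update-crypto-policies --show           # see which policy is currently active
    update-crypto-policies --set DEFAULT    # assumes DEFAULT was the previous policy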
Expected Results
Ideally, all sites connect without problems, i.e. the loop above prints no FAIL lines.
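If a site does fail, the full handshake output from openssl s_client may show why the FUTURE profile rejected it (for example, a server key too small for the profile). A sketch, with example.org standing in for the failing host:
    openssl s_client -connect example.org:443 -servername example.org </dev/null   # example.org is a placeholder for the failing host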