If you have curl, you could try this:
#!/bin/bash
set -o posix
TIMEFORMAT=%R
time curl -# \
--connect-timeout 4 --retry 2 \
-H "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:78.0) Gecko/20100101 Firefox/78.0" \
-H "Accept-Charset: UTF-8" \
-H "Accept-Encoding: gzip, deflate" \
-H "Accept: */*" \
-H "Accept-Language: en" \
-H "Cache-Control: 0" \
-H "Connection: keep-alive" \
-w "# tcp:\t\t%{time_connect}\n# tries:\t%{num_connects}\n# dns:\t\t%{time_namelookup}\n# ssl:\t\t%{time_appconnect}\n# size_dl:\t%{size_download}\n# speed_dl:\t%{speed_download}\n# url:\t\t%{url_effective}\n# total:\t%{time_total}\n# Real Time:\t" \
-o /dev/null https://news.ycombinator.com/robots.txt
The reason I suggest fetching robots.txt is that it rarely changes, so multiple people can run the script and compare notes against something constant. Here is my output:
./test.sh
######################################################################## 100.0%
# tcp: 0.066
# tries: 1
# dns: 0.004
# ssl: 0.484
# size_dl: 158
# speed_dl: 285.000
# url: https://news.ycombinator.com/robots.txt
# total: 0.554
# Real Time: 0.569
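Since single runs vary quite a bit (DNS caching, TLS session reuse, network jitter), it can help to repeat the probe a few times and compare the totals. A minimal sketch of that idea, using the same `-w` variables against a local file first so the output format can be sanity-checked without the network (`/tmp/robots-sample.txt` is just a throwaway file for illustration):

```shell
#!/bin/bash
# Create a small local file to exercise curl's -w variables offline.
printf 'User-agent: *\n' > /tmp/robots-sample.txt

# Run the probe three times; each line reports total time and bytes fetched.
for i in 1 2 3; do
  curl -s -o /dev/null \
    -w "run $i total: %{time_total}s size: %{size_download}\n" \
    file:///tmp/robots-sample.txt
done
```

Swap the `file://` URL for https://news.ycombinator.com/robots.txt to collect real timings; the first run will usually be slowest because later runs may benefit from a warm DNS cache.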