CNAME works for nslookup but not curl

I have created two CNAMEs for my (CloudFront) domains, like this:

[screenshot of the two CNAME records]

nslookup works and returns the correct values for both www and d-test. But when I actually fetch the data, using curl or a browser, everything is served from the origin that www points to, regardless of which URL I use. If I use the origin URLs directly, it works fine.

How is this even possible?
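One way to narrow this down is to separate the DNS layer (which nslookup tests) from the HTTP layer. The sketch below uses placeholder domains and the placeholder edge IP 203.0.113.10; CloudFront selects the distribution from the Host header (the configured Alternate Domain Name), so pinning the connection to one IP and varying only the hostname shows whether the misrouting happens at the HTTP layer rather than in DNS:

```shell
# Diagnostic sketch; domains and the 203.0.113.10 IP are placeholders.
# DNS only selects the IP curl connects to; CloudFront then picks the
# distribution by Host header. Same IP, different hostnames:
curl -s --max-time 5 --resolve www.example.com:443:203.0.113.10 \
     -o /dev/null -w 'www:    %{http_code}\n' https://www.example.com/ || true
curl -s --max-time 5 --resolve d-test.example.com:443:203.0.113.10 \
     -o /dev/null -w 'd-test: %{http_code}\n' https://d-test.example.com/ || true
```

If both hostnames return the same content from the same IP, the distinction is being made (or missed) at the HTTP layer, not in DNS.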

Burp Suite cannot intercept wget and curl HTTP requests

I use Burp Suite as a proxy listener, and I have also set the macOS HTTP Proxy to match.

Burp Suite can now intercept all browsers (e.g. Firefox, Safari, Chrome) and applications (e.g. Dictionary) on my Mac: [screenshot of the intercepted requests]

but it cannot intercept requests made by wget and curl.

such as:


Don’t curl and wget use the HTTP protocol for their requests too?


  1. Why do all browsers and applications use this proxy by default once I set it in the macOS HTTP Proxy preferences? I did not configure it in each browser individually.

  2. Why do curl and wget not use the proxy by default? Even when I set --proxy it still does not work:

wget --proxy 
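The short version, as a sketch: the macOS system-wide proxy setting is read through Apple's networking frameworks, which browsers and most GUI apps use; command-line tools like curl and wget read environment variables instead. Burp's default listener address, 127.0.0.1:8080, is assumed here:

```shell
# curl and wget ignore the macOS system proxy; they read environment
# variables instead. 127.0.0.1:8080 is Burp's default listener.
export http_proxy=http://127.0.0.1:8080
export https_proxy=http://127.0.0.1:8080
# curl https://example.com/                            # now goes through Burp
# curl -x http://127.0.0.1:8080 https://example.com/   # one-off, no env var
# wget https://example.com/                            # wget reads the same vars
# For HTTPS interception, curl also needs Burp's CA: curl --cacert burp.pem ...
echo "$https_proxy"
```

Note also that curl's flag is -x/--proxy URL, while wget configures proxies through these environment variables or its -e/wgetrc settings rather than a --proxy URL flag, which is likely why passing --proxy on the command line did not behave as expected.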

curl to wfuzz translation

I am trying to get wfuzz to match a curl command that works. I know valid credentials, but wfuzz does not seem to pass them properly:

wfuzz -c -w user -w pass -b "session=cookie" --digest FUZZ:FUZ2Z ""

(the user and pass files contain the username and password, respectively)

curl -c cookie --digest -u user:pass

The target is running the Gunicorn web server.

curl query to regular URL: basic auth

I’m currently experimenting with the Toggl API.

For example, the page states:

curl -v -u 1971800d4d82861d8f2c1651fea4d212:api_token -X GET 

If I enter this in my console (using my own API token, not the example token, of course), it works and I get back the requested JSON.

If I try to get the JSON directly in a browser, however, by restating the query as

I’m getting

Access to was denied You don't have authorization to view this page. HTTP ERROR 403 

What am I doing wrong? I thought that user:pw@domain would be equivalent to curl -u user:pw -X GET domain.
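In principle both forms produce the same thing. A minimal sketch (user:pw are placeholders) of the header each is meant to generate:

```shell
# Both `curl -u user:pw URL` and `https://user:pw@host/...` are meant to
# send the same request header:
#     Authorization: Basic base64("user:pw")
token=$(printf '%s' 'user:pw' | base64)
echo "Authorization: Basic $token"
# Equivalent explicit form, useful when a client drops URL credentials:
#   curl -H "Authorization: Basic $token" "$URL"
```

One possible explanation for the 403: browsers are not obliged to honor URL-embedded credentials. Some strip or block user:pw@ in URLs outright, and others only send the credentials after receiving a 401 challenge; a server that answers 403 directly never triggers that fallback. Sending the Authorization header explicitly sidesteps both problems.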

Is it possible to exploit PHP curl?

I have been working on a project, and I wonder whether it’s possible to exploit the curl_exec function in PHP. Scenario: I have a PHP script that checks whether a domain is up for me, but the curl call is not secured. Can it be exploited via the request, e.g. script.php?action=websiteup&token=apikey&target=EVIL CODE HERE? I have googled around but did not find anything that answers my question; I found some theories, but nothing helped.


case "websiteup":
    $ch = curl_init($_GET['target']);
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => array(
        ),
        CURLOPT_TIMEOUT        => 15,
        CURLOPT_CUSTOMREQUEST  => "HEAD",
        CURLOPT_REFERER        => $_GET['target'],
        CURLOPT_USERAGENT      => "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36",
    ));
    curl_exec($ch);

    $response['message'] = array(
        "code" => curl_getinfo($ch, CURLINFO_HTTP_CODE),
        "time" => curl_getinfo($ch, CURLINFO_CONNECT_TIME),
    );

Any ideas on how it can be exploited, and how to secure it?
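For context on the kind of risk involved: the danger is not injected shell code (the target value never reaches a shell), but unrestricted URL fetching. curl accepts every scheme libcurl was built with, not just http(s), so an unchecked target turns the endpoint into a server-side request forgery (SSRF) primitive. A sketch of the idea, with hypothetical request URLs, demonstrated via the curl CLI:

```shell
# curl_init() accepts any scheme libcurl supports. If $_GET['target']
# reaches it unchecked, requests like these (hypothetical) become possible:
#   script.php?action=websiteup&token=apikey&target=file:///etc/passwd
#   script.php?action=websiteup&token=apikey&target=http://169.254.169.254/
# The same behaviour shown with the curl command-line tool:
curl -s 'file:///etc/passwd' | head -n 1
```

The usual mitigations are restricting the allowed protocols (e.g. limiting CURLOPT_PROTOCOLS to HTTP/HTTPS) and validating the target host before making the request.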

cURL timeout error 28 in Site Health and Sucuri SiteCheck

I run a server hosting multiple WordPress installations with the iThemes Security Pro plugin installed. One of the things this plugin does is use Sucuri SiteCheck to scan the site for vulnerabilities:

Recently, SiteCheck has been failing on all of my sites, reporting the following error:

Unable to properly scan your site. Timeout reached 

Coincidentally, the new WordPress Site Health tool has also been reporting the following error on all my sites:

The REST API is one way WordPress, and other applications, communicate with the server. One example is the block editor screen, which relies on this to display, and save, your posts and pages. The REST API request failed due to an error. Error: [] cURL error 28: Connection timed out after 10000 milliseconds

I suspect the issues are related, but I don’t know where to start fixing this. I have Fail2Ban and ModSecurity enabled on my server and in Apache, respectively, but the problem persists even when I turn those services off.

I would appreciate it if someone could help pinpoint possible causes. SiteCheck has always worked on my server without a hitch.
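A diagnostic sketch worth running on the server itself (example.com stands in for one of the affected sites): both Site Health and remote scanners make an HTTP request back to the site, so a loopback request that firewalls, Fail2Ban, or DNS misconfiguration slow past the ~10-second limit reproduces cURL error 28 directly:

```shell
# Run on the server itself; example.com is a placeholder for your site.
# Reproduces the loopback REST request that Site Health times out on.
code=$(curl -s -o /dev/null --max-time 10 \
            -w '%{http_code} after %{time_total}s' \
            "https://example.com/wp-json/" || true)
echo "REST API loopback: HTTP $code"
```

If this times out locally while the site loads fine from outside, the place to look is how the server reaches its own hostname (hosts file, NAT loopback, local firewall rules) rather than the plugin itself.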

Cron job not working (CURL)

I have a server running Linux 19.04.

First I did: nano /etc/crontab

The following entries were already in the file:

SHELL=/bin/sh
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin

# m h dom mon dow user  command
13 *    * * *   root    cd / && run-parts --report /etc/cron.hourly
18 00   * * *   root    test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.daily )
33 00   * * 7   root    test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.weekly )
43 00   1 * *   root    test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.monthly )
#

I added this line at the end, just before the final # line:

*/1 * * * * curl -s "" > /dev/null
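One detail worth checking: in /etc/crontab (unlike a per-user crontab installed with crontab -e), every entry needs a user field between the schedule and the command, as the existing root lines above show. A sketch of the added line in that format, with a placeholder URL since the original one is not shown:

```
*/1 * * * *   root    curl -s "https://example.com/endpoint" > /dev/null 2>&1
```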

PHP curl extension not enabling

I’m using the ondrej/php PPA, and have been for a while with no issues, but I recently upgraded to PHP 7.3 and hit a snag.

I uninstalled every reference to the previous version and installed 7.3. Everything else works fine, but curl is not showing up in my phpinfo(), even though it is installed and, according to phpinfo(), the curl.ini file is being read.

I even tried reinstalling the curl extension, but no dice; it is still not showing up.

It’s not showing up for either Apache or the CLI.

I’m at a loss as to what I can do to get curl working, and I need it for one of my web apps.
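Some hypothetical recovery steps for ondrej/php on Ubuntu (package names and paths assumed; adjust the version if needed). The CLI check at the top is safe to run anywhere; the apt/phpenmod steps are shown for reference:

```shell
# Check whether the CLI actually loads the extension and which ini files
# it reads; phpinfo() listing curl.ini only means the file was parsed.
if command -v php >/dev/null 2>&1; then
    php -m | grep -i curl || echo "curl module not loaded in CLI"
    php --ini | head -n 4        # which php.ini / conf.d dir is in use?
else
    echo "php not on PATH"
fi
# sudo apt install --reinstall php7.3-curl
# sudo phpenmod -v 7.3 curl      # recreate the conf.d symlink for curl.ini
# sudo systemctl restart apache2
```

The distinction matters because a curl.ini that is read but whose extension= line is commented out, or a curl.so built for a different PHP API version, still leaves the module unloaded; php -m is the more direct check than phpinfo()'s list of parsed ini files.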