Burp Proxy vs MITM

I have recently started using Burp as a proxy for hunting bugs on websites, and I see many submissions where people have intercepted and modified requests/responses to exploit certain logic flaws in web applications. This is possible only because we have installed Burp's CA certificate in our browser, which allows Burp to decrypt the traffic to and from the web application. In a realistic scenario, though, the attacker would have to conduct a MITM attack to intercept or modify traffic. This makes me wonder what the point of intercepting traffic with Burp is.
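
For what it's worth, the same interception can be reproduced outside the browser, which makes the two client-side prerequisites explicit: the client must be pointed at the proxy and must trust its CA. A minimal sketch in Python, assuming Burp is listening on its default 127.0.0.1:8080 and its CA certificate has been exported to burp-ca.pem (the port and the file path are assumptions about a typical setup):

```python
import requests

# Route the request through Burp (typical default listener: 127.0.0.1:8080).
proxies = {
    "http": "http://127.0.0.1:8080",
    "https": "http://127.0.0.1:8080",
}

# TLS interception only succeeds because this client explicitly trusts Burp's CA;
# without that (or the CA installed system-wide), certificate validation fails.
response = requests.get(
    "https://example.com/",
    proxies=proxies,
    verify="burp-ca.pem",  # path to Burp's exported CA certificate (an assumption)
)
print(response.status_code)
```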

Is there a limit to the total number of CAPTCHA tries? Is the same proxy used for all tries?

I know some systems have a limit and will lock out a user after x failed CAPTCHAs.
But I also know that this might only matter if one proxy is used for consecutive solve attempts after failures. Which way does SER handle consecutive solve attempts: the same proxy or a different one?
And if the same proxy is used, what is the highest total number of tries you would suggest? I have used various external solvers, and even the ones at the end of the list seem to get used, so having a high number of tries does appear to work.
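
To make the question concrete, here is a rough Python sketch of the two behaviours I'm asking about; it is not meant to reflect how SER actually works internally, and the proxy list and success rate are made up:

```python
import itertools
import random

proxies = ["proxy1:8080", "proxy2:8080", "proxy3:8080"]  # made-up proxy list
MAX_TRIES = 5  # the "total number of tries" setting in question

def solve_captcha(proxy):
    """Placeholder for one solver attempt routed through the given proxy."""
    return random.random() < 0.3  # pretend roughly 30% of attempts succeed

# Strategy A: every consecutive attempt through the same proxy
# (risks tripping a "lock out after x fails" rule on the target site).
def solve_same_proxy():
    proxy = random.choice(proxies)
    return any(solve_captcha(proxy) for _ in range(MAX_TRIES))

# Strategy B: a different proxy for each attempt
# (failure counters are spread across proxies).
def solve_rotating_proxies():
    rotation = itertools.cycle(proxies)
    return any(solve_captcha(next(rotation)) for _ in range(MAX_TRIES))
```
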
Thanks…

Proxy Failover Advice

Overview

I’m currently deploying a pair of HA firewall devices that will act as a transparent forward proxy (traffic will be directed through the proxy via routing rather than by configuring a proxy URL on the client machines) for outbound traffic. I have the high-availability configuration in place and working, and I can see that the TCP sessions are shared across both devices. When failover is triggered, the passive device assumes the IP addresses of the previously active instance (actually the whole network adaptor is moved across, as this is on AWS).

Issue

As a test, I set up a web server and created a large HTML file. I then used a client machine to retrieve this file with wget and curl (via my proxies), and during the download I performed a manual failover. When I did, the wget download got stuck (the same happened with curl). I then added connection timeouts; the wget command timed out and restarted the download, which worked fine, although I could see that a new TCP session was created. One thing to note is that this is a cloud setup where failover is much slower than on high-spec on-premises devices, so it can take 15–60 seconds to complete. I’m trying to ensure that my deployment will not have a significant impact on applications, which will mainly be sending HTTP traffic.
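
For reference, the timeout-and-retry workaround on the client side looks roughly like the following Python sketch; the URL and the exact timings are placeholders, and the back-off is only sized to ride out a 15–60 second failover:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry a handful of times with exponential back-off, long enough to
# outlast a 15-60 second failover window.
retry = Retry(
    total=5,
    connect=5,
    read=5,
    backoff_factor=10,
    status_forcelist=[502, 503, 504],
)

session = requests.Session()
session.mount("http://", HTTPAdapter(max_retries=retry))
session.mount("https://", HTTPAdapter(max_retries=retry))

# No explicit proxy configuration: traffic reaches the proxy via routing
# (transparent proxy), exactly as the clients are set up here.
# (connect timeout, read timeout) so a stalled transfer is abandoned
# instead of hanging indefinitely.
response = session.get(
    "http://test-webserver.internal/large-file.html",  # placeholder URL
    timeout=(5, 30),
)
print(len(response.content), "bytes downloaded")
```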

Questions

  1. Is it reasonable to expect an HTTP download to continue after a failover, or should the client use timeouts and retries to restart the download?

  2. Am I likely to need application teams to change their timeout and retry settings? What are considered normal timeout and retry settings for applications that regularly send API requests?

  3. Is TCP session sharing really only suitable for pure TCP connections (database connections, etc.), or is there any real use for HTTP connections?

Proxy Error, DNS lookup failure for -SOS-

Hello, everyone,

Please, I need your professional help.

I built a new site (3 months old), and because my page speed was too low, I wanted to improve it. But I made a big mistake! I came across a platform called "Ezoic.com", followed their instructions, and changed my server entries at my DNS provider. And that locked me out: my website is no longer accessible, neither via the domain name nor via wp-admin. I don't know what to do anymore. I restored the settings on my server, but…
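
In case it helps with the diagnosis, a quick way to check what the domain currently resolves to (and whether the restored records have propagated yet) is a small lookup like the one below, where example-site.com stands in for the real domain:

```python
import socket

domain = "example-site.com"  # placeholder for the real domain

try:
    # Ask the system resolver which addresses the domain currently resolves to.
    results = socket.getaddrinfo(domain, 443, proto=socket.IPPROTO_TCP)
    addresses = sorted({entry[4][0] for entry in results})
    print(f"{domain} resolves to: {addresses}")
except socket.gaierror as exc:
    # This is essentially the "DNS lookup failure" the proxy error page reports.
    print(f"DNS lookup failed for {domain}: {exc}")
```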


Chrome extension differences: Urban Shield vs. Urban Free VPN proxy Unblocker

What’s the difference between these two Chrome extensions, both of which provide VPN functionality for browsing in Chrome:

Urban Shield: https://chrome.google.com/webstore/detail/urban-shield/almalgbpmcfpdaopimbdchdliminoign?hl=en

Urban Free VPN proxy Unblocker: https://chrome.google.com/webstore/detail/urban-free-vpn-proxy-unbl/eppiocemhmnlbhjplcgkofciiegomcon

They are both developed by the same company, but I couldn’t find any explanation regarding the differences between the two.

Does routing internet traffic via VPN through a company proxy provide any additional security?

There is an ongoing discussion in our company about what security measures to put in place regarding workstation access to the company network and the internet.

Situation:

  • Employees have Linux laptops with encrypted SSDs
  • These SSDs hold the company's intellectual property
  • Employees have unrestricted root access to these machines
  • Antivirus software is installed and running

Goal:

  • Protect the company's intellectual property against theft while still allowing employees to work from anywhere in the world

Current idea:

  • Use a VPN to tunnel all network traffic (including internet traffic) through the company
  • Do not allow direct internet access over the VPN; instead, enforce that a proxy server must be used.

Question:

Does the additional proxy server for internet access provide more security than it (potentially) costs in effort (additional client configuration, programs and services, …)?

Laptop <-> VPN <-> Proxy <-> Internet vs. Laptop <-> VPN <-> Internet

Brainstorming:

If the laptop is compromised (a backdoor is running), how does the VPN protect the data anyway, given that the user has root access and can change the network configuration (routes, iptables, …) as they please? What additional security does a company proxy give?
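
To make the comparison more tangible, here is a rough sketch (in Python, purely illustrative, with made-up domain lists) of the kind of per-request decision and central logging a forward proxy can add on top of plain "VPN -> internet" routing:

```python
import logging
from urllib.parse import urlsplit

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("egress-proxy")

# Made-up policy lists: destinations the proxy allows or explicitly blocks.
ALLOWED_DOMAINS = {"pypi.org", "github.com", "intranet.example.com"}
BLOCKED_DOMAINS = {"pastebin.com", "file-sharing.example.net"}

def decide(url: str, user: str) -> bool:
    """Return True if the proxy should forward this request."""
    host = urlsplit(url).hostname or ""
    if host in BLOCKED_DOMAINS:
        log.warning("DENY  %s -> %s (blocked destination)", user, host)
        return False
    if host not in ALLOWED_DOMAINS:
        log.warning("DENY  %s -> %s (not on the allowlist)", user, host)
        return False
    log.info("ALLOW %s -> %s", user, host)
    return True

# The proxy sees every request per user and destination and can log or block it,
# which a plain "VPN -> internet" route would not centrally record.
decide("https://github.com/company/repo.git", user="alice")
decide("https://pastebin.com/raw/abc123", user="alice")
```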

Proxies Passed in Scrapebox Proxy Checker But Failing in GSA SER

Hi Sven,

I need your clarification about an issue I’m having at the moment.

I currently use SB to check proxies I found through port scanning.

My problem is this: after checking the proxies in SB I might get, say, 400 working proxies. If I test them again within a five-minute window, they are still working.

I then take them to SER to check, and all of them fail except maybe 3 or 5 proxies.

When I immediately go back to SB to check the proxies, only about 7 fail, but when I re-check in SER, the same issue persists.

What could I be doing wrong?

Or are there specific proxy types that SB works with that differ from those GSA SER uses?

If yes, how do I go about searching for GSA SER-compatible proxies in particular?
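
In case it helps to narrow things down, this is roughly how I could test one of the proxies for both plain HTTP and HTTPS (CONNECT) outside of either tool; a proxy that only passes one of the two would explain the different results. The proxy address below is just an example, and httpbin.org is just one public echo service:

```python
import requests

proxy = "123.45.67.89:8080"  # example address from the port-scanned list
proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}

for url in ("http://httpbin.org/ip", "https://httpbin.org/ip"):
    try:
        r = requests.get(url, proxies=proxies, timeout=10)
        print(f"{url}: OK ({r.status_code}) {r.text.strip()}")
    except requests.RequestException as exc:
        # A proxy that passes the plain-HTTP URL but fails the HTTPS one
        # most likely does not support CONNECT tunnelling.
        print(f"{url}: FAILED ({type(exc).__name__})")
```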

Thanks a lot for your usual understanding.