Cyber Security & Dot Net Security

Saturday, February 26, 2011

Google SERPS - Mystery search algorithm gets Speed criteria


Google has announced that it will be adding site speed, the time a website takes to load, as a criterion in its search rankings. This was hinted at in Google's post last December and has now been formally announced by Amit Singhal and Matt Cutts of Google's search quality team. Site speed as a new parameter reflects "how quickly a site responds to web requests". The change, apparently, has been adopted to make this world a happier place:
"Speeding up websites is important — not just to site owners, but to all Internet users. Faster sites create happy users and we've seen in our internal studies that when a site responds slowly, visitors spend less time there,"
Faster websites reduce operating costs, improve user experience and overall make the internet a more habitable place. But as there are always two faces of a coin, some webmasters are just not finding site speed a solid idea. What about websites that carry advertisements? They will obviously load slower than plain HTML websites with no advertisements. What about websites with Flash content? Worse still, the Google AdSense and Google AdWords code is known to slow a website down. Would that ultimately affect a website's rankings?
Google has provided a list of free tools to measure the speed of a website, such as Google Page Speed and Yahoo's YSlow. Google might also use Google Toolbar data to measure site speed, but is that a reliable measure? Further, the Google duo commented:
“While site speed is a new signal, it doesn't carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point. We launched this change a few weeks back after rigorous testing. If you haven't seen much change to your site rankings, then this site speed change possibly did not impact your site.”

Basics of Javascript Injection


JavaScript is a widely used technology within websites and web based applications. It can be used for all sorts of useful things and functions, but along with this come some additional security issues that need to be thought about and tested for, because JavaScript can be used not only for good purposes but also for malicious ones. JavaScript injection is a nifty little technique that allows you to alter a site's contents without actually leaving the site, reloading the page, or saving the site to your desktop. It can be very useful when you need to change hidden data before you send it to the server, for example to spoof the server by editing some form options. Let's start with some basic injection techniques.

I. Injection Basics
JavaScript injections are run from the URL bar of the page you are visiting. To use them, you must first completely clear the URL bar. That means no "http://" or anything else.
JavaScript is run from the URL bar by using the javascript: protocol. In this tutorial I will only teach you the bare bones of using this, but if you are a JavaScript guru, you can expand on this using plain old JavaScript.
The two commands covered in this tutorial are the alert(); and void(); commands. These are pretty much all you will need in most situations. For your first JavaScript injection, you will make a simple window appear: first go to any website and then type the following into your URL bar:

javascript:alert('Hello, World');
You should get a little dialog box that says “Hello, World”. This will be altered later to have more practical uses.
You can also have more than one command run at the same time:
javascript:alert('Hello'); alert('World');
This would pop up a box that says 'Hello' and then another that says 'World'.

II. Cookie Editing
First off, check to see if the site you are visiting has set any cookies by using this script:
javascript:alert(document.cookie);
This will pop up any information stored in the site's cookies. To edit any information, we make use of the void(); command.
javascript:void(document.cookie="Field = myValue");
This command can either alter existing information or create entirely new values. Replace “Field” with either an existing field found using the alert(document.cookie); command, or insert your very own value. Then replace “myValue” with whatever you want the field to be. For example:
javascript:void(document.cookie="Authorized=yes");
This would either create the field "Authorized" with the value "yes" or edit it to say "yes"... now whether or not this does anything of value depends on the site you are injecting it on.
It is also useful to tack an alert(document.cookie); at the end of the same line to see what effect your altering had.
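For example:
javascript:void(document.cookie="Authorized=yes");alert(document.cookie);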

III. Form Editing
Sometimes, to edit values sent to a given website through a form, you can simply download that HTML and edit it slightly to allow you to submit what you want. However, sometimes the website checks to see if you actually submitted it from the page you were supposed to. To get around this, we can just edit the form straight from JavaScript. Note: the changes are only temporary, so it's no use trying to deface a site through JavaScript injection like this.
Every form on a given webpage (unless named otherwise) is stored in the forms[x] array... where "x" is the number, in order from top to bottom, of all the forms in a page. Note that the forms start at 0, so the first form on the page would actually be 0, and the second would be 1 and so on. Let's take this example:
<form action="http://www.website.com/submit.php" method="post">
<input type="hidden" name="to" value="admin@website.com">
</form>
Note: since this is the first form on the page, it is forms[0].
Say this form was used to email vital server information to the admin of the website. You can't just download the script and edit it because the submit.php page looks for a referer. You can check to see what value a certain form element has by using this script:
javascript:alert(document.forms[0].to.value)
This is similar to the alert(document.cookie); discussed previously.
In this case, it would pop up an alert that says "admin@website.com".
So here's how to inject your own email into it. You can use pretty much the same technique as the cookie editing shown earlier:
javascript:void(document.forms[0].to.value="email@nhacks.com")
This would change the email of the form to be “email@nhacks.com”. 
Then you could use the alert(); script shown above to check your work, or you can couple both commands on one line:
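javascript:void(document.forms[0].to.value="email@nhacks.com");alert(document.forms[0].to.value);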
That completes this post about JavaScript injection. As you can see, all kinds of fun things can be done with these techniques. Use your imagination, and with a little work you can test your own site and keep it secure from malicious hackers.

XSS vulnerabilities in top websites – What were they thinking ?


Recently, I tried to have a paradigm shift of sorts, to move from ASM to web technologies, and landed directly (again) on XSS vulnerabilities. Being a fan of RSnake, the God of XSS, I always wanted to learn a bit more about the web app security scenario, so I tried my hands on some XSS vulnerabilities and how they can be used to manipulate sessions.
The results? Well, I found vulnerabilities in some prominent sites, which I am disclosing here. The deal is that I tried to contact the vendors (more on this later) to notify them of the vulns. Remember the hackable government & educational websites article? Consider this its spiritual follow-up.
Disclaimer 
I HAVE NOT HACKED ANY OF THE SITES AND THEIR DATABASES IN ANY WAY, JUST TESTED WEBSITES FOR VULNERABILITIES. I TESTED THEM AND FOUND ERRORS WHICH MAY/MAY NOT BE DISCLOSED HERE, AND IN NO WAY CAN ANYONE SUE ME FOR THIS AS I DID AND MEANT NO HARM TO THE DATA OF THE CONCERNED ORGANIZATIONS.
BY READING THIS ARTICLE YOU AGREE WITH THE DISCLAIMER.
IF YOU AGREE WITH THIS AGREEMENT, CONTINUE READING; OTHERWISE LEAVE THIS WEBSITE IMMEDIATELY.
Here we go. It all started with www.in.com [ALEXA RANK 298], which had a simple XSS vulnerability that could display cookies, inject code & God knows what else. I tried to contact the technical team in vain, then contacted them via a simple feedback form; I am still waiting for their response. I moved on. Please note that I have censored all the URLs & script details so as to prevent attacks originating from this article :P
in.com was a piece of cake - theprohack.com

I never liked Rediff [ALEXA RANK 128] & their services... too many ads to digest for me. Again, it was easy to inject.
 rediff XSSD..again & again & again - theprohack.com
dl4all.com [ALEXA RANK 1517] was no exception; a simple search & that's all.
dl4all xss for everyone- theprohack.com
shaadi.com [ALEXA RANK 935], the premium matrimonial portal of India, has XSS flaws too.
shaadi.com had seen better days - theprohack.com
A leading social networking website, Itimes.com [ALEXA RANK 6024], was no better & a no-brainer.
itimes was a nobrainer - theprohack.com
Indiatimes.com [ ALEXA RANK 168 ] anyone ?
indiatimes xss - theprohack.com
Enough. As expected, I tried to contact all the support staff before releasing this article. My point? What happens when there is no competent technical staff to handle the issue (I am looking at you, AXIS BANK!)? Have a look at it.
now that makes me angry..very angry - theprohack.com
Great... now that's what we wanted: more flaws in "secure" websites which pledge to protect our privacy. For the record, XSS flaws are independent of encryption & the so-called layman lock mechanisms, as the application behavior remains the same. I tried to contact the authorities at AXIS Bank, but they kept asking for my bank account number, telling me to contact the nodal officer, to contact xyzabc, blah blah... but no support / tech staff.
Let's have a look at the Alexa ranks of the above websites -
www.rediff.com – 128
www.in.com – 298
www.indiatimes.com – 168
www.shaadi.com – 935
www.dl4all.com – 1517
www.axisbank.com – 2330
www.itimes.com – 6024
Again, the above sites are the head honchos of social networking and downloads, have a lot of data in their hands & are vulnerable to XSS. Why they still have no technical feedback team is beyond me. Except for itimes, I wasn't able to find a bug reporting facility on any of the sites mentioned above. Now that's pure genius! Just what were they thinking?! Can't they learn from some good examples?
What am I doing? To quote RSnake:
“ How many compromises of data security, that you are aware of, have been disclosed to the public as a percentage? “


Ditto here... the websites are not safe, & neither are their claims. It took me about 30 minutes to write this post, including the time to try XSS on the websites (excluding the email contact with the authorities where possible). XSS/CSRF is the modern security nightmare for any website today; prominent websites are constantly under attack, & in the recent cases I have heard of, a lot of bank websites were targets of CSRF-based attacks, phishing & XSS. Imagine what a skillful attacker can do with a lot of time & patience (& a reason, perhaps).
pray & spray..& trust no one.

Tuesday, February 15, 2011


Common Control Panel Applets in Windows


  • access.cpl - Accessibility Applet
  • appwiz.cpl - Add/Remove Programs Applet
  • console.cpl - Console Applet
  • timedate.cpl - Date and Time Applet
  • desk.cpl - Display Applet
  • fax.cpl - Fax Applet
  • hdwwiz.cpl - Hardware Wizard Applet
  • irprops.cpl - Infrared Port Applet
  • intl.cpl - International and Regional Applet
  • inetcpl.cpl - Internet Settings Applet
  • joy.cpl - Joystick Applet
  • liccpa.cpl - Licensing Applet
  • main.cpl - Mouse and Keyboard Applet
  • mlcfg32.cpl - Mail Applet
  • mmsys.cpl - Sound and Multimedia Applet
  • modem.cpl - Modem and Phone Applet
  • ncpa.cpl - Network and connectivity Applet
  • netcpl.cpl - Network and Dial-up Connectivity Applet
  • nwc.cpl - Netware Client Applet
  • odbccp32.cpl - ODBC Applet
  • devapps.cpl - PC Card Applet
  • ports.cpl - Ports Applet
  • powercfg.cpl - Power Management Applet
  • srvmgr.cpl - Server Manager Applet
  • sapi.cpl - Speech Properties Applet
  • sysdm.cpl - System Applet
  • telephon.cpl - Telephony Applet
  • nusrmgr.cpl - User Manager Applet
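
Any of these applets can be launched directly from Start > Run (or a command prompt) by typing its file name, e.g. "control appwiz.cpl". Below is a minimal C# sketch of the same idea, using appwiz.cpl from the list above as the example; any other .cpl name from the list works the same way.

using System.Diagnostics;

class CplLauncher
{
    static void Main()
    {
        // control.exe accepts the applet file name as an argument,
        // so this opens the Add/Remove Programs applet.
        Process.Start("control.exe", "appwiz.cpl");
    }
}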

Friday, December 24, 2010

C# Parse Meta Tags

You may come across an instance in your C# and ASP.NET programming where you need to download an external webpage and parse the meta tags... specifically, the "Title," "Meta Description," and "Meta Keywords."

The method below will show you how to:

    * download an external webpage
    * parse the meta title
    * parse the meta description
    * parse the meta keywords

The parsing is done using regular expressions.

NOTE: This may not be the best way of doing this, but it is a solution that you can use.



using System;
using System.Collections.Generic;
using System.Text;
using System.Net;
using System.Text.RegularExpressions;
using System.IO;
 
namespace Tim.Examples.Classes
{
    public class WebMetaData
    {
        public string metaTitle;
        public string metaDescription;
        public string metaKeywords;
 
        public bool GetMetaTags(string url)
        {
            try{
                //get the HTML of the given page and put into a string
                string html = AcquireHTML(url);
 
                if (GetMeta(html))
                {
                    return true;
                }
                else
                {
                    return false;
                }
            }
            catch(Exception ex)
            {
                // do something with the error
                return false;
            }
        }
 
        private string AcquireHTML(string address)
        {
            HttpWebRequest request;
            HttpWebResponse response = null;
            StreamReader reader;
            StringBuilder sbSource;
 
            try
            {
                // Create and initialize the web request  
                request = System.Net.WebRequest.Create(address) as HttpWebRequest;
                request.UserAgent = "your-search-bot";
                request.KeepAlive = false;
                request.Timeout = 10 * 1000;
 
                // Get response  
                response = request.GetResponse() as HttpWebResponse;
 
                if (request.HaveResponse == true && response != null)
                {
                    // Get the response stream  
                    reader = new StreamReader(response.GetResponseStream());
 
                    // Read it into a StringBuilder  
                    sbSource = new StringBuilder(reader.ReadToEnd());
 
                    response.Close();
 
                    // Console application output  
                    return sbSource.ToString();
                }
                else
                    return "";
            }
            catch (Exception ex)
            {
                // only close the response if we actually got one
                if (response != null)
                    response.Close();
                return "";
            }
        }
 
        private bool GetMeta(string strIn)
        {
            try
            {
                // --- Parse the title
                Match TitleMatch = Regex.Match(strIn, "<title>([^<]*)</title>",
                    RegexOptions.IgnoreCase | RegexOptions.Multiline);
                metaTitle = TitleMatch.Groups[1].Value;
 
                // --- Parse the meta keywords
                Match KeywordMatch = Regex.Match(strIn, "<meta name=\"keywords\" content=\"([^<]*)\">",
                    RegexOptions.IgnoreCase | RegexOptions.Multiline);
                metaKeywords = KeywordMatch.Groups[1].Value;
 
                // --- Parse the meta description
                Match DescriptionMatch = Regex.Match(strIn, "<meta name=\"description\" content=\"([^<]*)\">",
                    RegexOptions.IgnoreCase | RegexOptions.Multiline);
                metaDescription = DescriptionMatch.Groups[1].Value;
 
                return true;
            }
            catch (Exception ex)
            {
                // do something with the error
                return false;
            }
        }
 
    }
}
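
Here is a quick usage sketch of the class above so you can see how the pieces fit together; the URL is just a placeholder, so substitute any page you want to inspect.

using System;
using Tim.Examples.Classes;

class Program
{
    static void Main()
    {
        WebMetaData meta = new WebMetaData();

        // GetMetaTags downloads the page and fills the three public fields
        if (meta.GetMetaTags("http://www.example.com/"))
        {
            Console.WriteLine("Title:       " + meta.metaTitle);
            Console.WriteLine("Description: " + meta.metaDescription);
            Console.WriteLine("Keywords:    " + meta.metaKeywords);
        }
        else
        {
            Console.WriteLine("Could not download or parse the page.");
        }
    }
}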

Creating AJAX-Enabled WebParts in ASP.NET 2.0


1. Open Visual Studio 2005 and create a new website
2. Select ASP.NET AJAX-Enabled Web Site then press OK (See Figure A)
Figure A
3. As you may have noticed, the ScriptManager control is automatically added to your page. ScriptManager handles the AJAX functionality in the ASP.NET page.
4. Drag an UpdatePanel control below the ScriptManager (See Figure B)
Figure B.
5. From the Visual Studio Toolbox, drag a WebPartManager inside the UpdatePanel control
6. Drag two WebPartZone controls below the WebPartManager (See Figure C)
Figure C.
7. Drag and place a TextBox control in the WebPartZone
8. Compile the application and run the website
9. Drag a web part into different zones and observe what happens

As you may have noticed, you can drag and drop a web part only once; if you try to drag it again, it will just highlight the header of the web part you are trying to drag. Basically, by default, web part drag and drop functionality is not supported inside an UpdatePanel control, but the ASP.NET team provided a workaround for moving web parts within an UpdatePanel control.

Note: this workaround will ONLY work in the IE browser. You need to use Visual Studio 2008 / VWD 2008 with the latest version of the Microsoft ASP.NET Futures (AJAX Control Toolkit 3.5) in order to make the drag and drop feature work in all major browsers. See this link: http://geekswithblogs.net/dotNETvinz/archive/2008/09/12/ajax-enabled-webparts-and-firefox-drag-and-drop.aspx
The Problem
The ASP.NET Web Parts drag and drop feature and the drop-down verbs menu do not operate correctly inside a Microsoft AJAX 1.0 UpdatePanel.

Why doesn't it work?
The WebPartManager is responsible for registering an include and a start-up script. This script provides Web Parts and zones with various client side functionality, including drag and drop and drop-down verb menus.

When a control is placed inside an UpdatePanel, the script is rendered and run on the first render, but not on subsequent renders. Due to this, the client side functionality fails.

The Workaround
The solution is simple. Inherit the WebPartManager, override the RenderClientScript method, and render the client scripts using System.Web.UI.ScriptManager instead of System.Web.UI.ClientScriptManager.
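
To make the contrast concrete, here is a minimal, hypothetical sketch of the two registration paths. The Demo page class, the "MovePartScript" key and the initWebPartDragDrop() call are placeholders for illustration only, not the actual code inside Sample.Web.UI.WebParts.dll.

using System;
using System.Web.UI;

public partial class Demo : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // ClientScriptManager registration: emitted on full renders only,
        // so the script is lost when an UpdatePanel partially refreshes the page.
        // Page.ClientScript.RegisterStartupScript(
        //     typeof(Demo), "MovePartScript", "initWebPartDragDrop();", true);

        // ScriptManager registration: the partial-rendering framework
        // re-emits and re-executes the script after every UpdatePanel postback.
        ScriptManager.RegisterStartupScript(
            this, typeof(Demo), "MovePartScript", "initWebPartDragDrop();", true);
    }
}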

The System.Web.UI.ScriptManager informs Ajax of the registered client scripts and ensures that they are rendered out and executed whenever an UpdatePanel is refreshed. To achieve this, follow the steps below:

1. Right click on the Project and Add a Reference (See Figure D)
Figure D.
2. Click the Add Reference. You should be able to see the add reference window.
3. Click on the Browse Tab (See Figure E)
 Figure E.
4. Browse the Sample.Web.UI.WebParts.dll file. I have attached a zipped file together with this article for the dll.
5. Add the following Tag Mapping to the specified section of your web.config

<configuration>
  <system.web>
    <pages>
      <tagMapping>
        <add tagType="System.Web.UI.WebControls.WebParts.WebPartManager"
             mappedTagType="Sample.Web.UI.WebParts.WebPartManager, Sample.Web.UI.WebParts"/>
      </tagMapping>
    </pages>
  </system.web>
</configuration>


6. Compile the Application and Run again.
7. The result can be seen in the screen shot below:



As you can see now, you can drag and drop web parts between WebPartZones without refreshing the page.