Sunday, December 9, 2012

Webcam Hacking using Google

There are thousands of unprotected webcams accessible online. Since many webcams use known protocols to transmit live video streams over the web, it is often very easy to search for publicly accessible webcams.
1. Open your web browser and visit Google.com.
2. Search Google for the following keyword: inurl:/view/index.shtml
3. Choose a webcam from the results and enjoy.
Google Dorks
Here is a list of a few Google dorks used for this purpose:

inurl:/view.shtml
intitle:"Live View / - AXIS" | inurl:view/view.shtml
inurl:ViewerFrame?Mode=
inurl:ViewerFrame?Mode=Refresh
inurl:axis-cgi/jpg
inurl:axis-cgi/mjpg (motion-JPEG)
inurl:view/indexFrame.shtml
inurl:view/index.shtml
inurl:view/view.shtml
liveapplet
intitle:"live view" intitle:axis
intitle:liveapplet
allintitle:"Network Camera NetworkCamera"
intitle:axis intitle:"video server"
intitle:liveapplet inurl:LvAppl
intitle:"EvoCam" inurl:"webcam.html"
intitle:"Live NetSnap Cam-Server feed"
intitle:"Live View / - AXIS"
intitle:"Live View / - AXIS 206M"
intitle:"Live View / - AXIS 206W"
intitle:"Live View / - AXIS 210"
inurl:indexFrame.shtml Axis
inurl:"MultiCameraFrame?Mode=Motion"
intitle:start inurl:cgistart
intitle:"WJ-NT104 Main Page"
intext:"MOBOTIX M1" intext:"Open Menu"
intext:"MOBOTIX M10" intext:"Open Menu"
intext:"MOBOTIX D10" intext:"Open Menu"
intitle:snc-z20 inurl:home/
intitle:snc-cs3 inurl:home/
intitle:snc-rz30 inurl:home/
intitle:"sony network camera snc-p1"
intitle:"sony network camera snc-m1"
site:.viewnetcam.com -www.viewnetcam.com
intitle:"Toshiba Network Camera" user login
intitle:"netcam live image"
intitle:"i-Catcher Console - Web Monitor"

WCIT - So Far So Good - Recap of Week One

Paul Budde 
The apocalypse that didn't happen
So far the world has survived WCIT-12 and the internet has not been taken over by anybody. So, in the end, what was all the fuss about?
Those who have followed my reporting on these issues from the very beginning more than a year ago — long before the media frenzy on this topic started (see: The Governance of the Internet) — will have seen that we never took the sensational approach. We fully understood the issues that were emerging, but at the same time we could also place them in the right context, to explore how they should be addressed. Even at that early stage we suggested cooperation with ICANN and it was good to see that this indeed eventuated at WCIT-12.
We also indicated from the beginning that we needed to separate the various issues in order to see how they could best be addressed. This is what the first week of WCIT-12 has been all about. However, I did not fully appreciate at that time what would be involved in 'herding 193 cats' — the number of countries involved in the ITU — in the same direction. This required some amazing diplomatic skills.
The loudest scaremongering came from the USA, particularly through campaigns organised by Google; but when it became clear that the internet apocalypse wasn't going to happen, people started to question Google's own agenda in all of this.
The frenzy was also fueled by the hard line taken by the US Administration on issues such as 'the take-over of the internet'. While it was understood that this was part of a posturing strategy it certainly fed the media frenzy in the USA.
What was conveniently left out of these discussions was the fact that proposals made by member states are not the same as policies accepted by the full ITU membership. It was these proposals that were promoted by the attackers as evidence that the end of the world was at hand.
Consensus is building
Now, at the end of the first week of WCIT-12, it is clear that, as expected, a far more conciliatory approach is being taken by all delegates.
The separation of the issues referred to above did, in fact, take place and the discussion was narrowed down to what WCIT is all about — updating the International Telecommunications Regulations (ITRs).
As indicated, there are a few things that I have come to appreciate during week one.
First of all, the language that is used, and the definitions. When you think about it, terms such as 'operating agencies', 'security' and 'ICT' cover an extensive variety of concepts, and if you want to include them in international treaties you need to be damn sure that everybody uses the same language. Another complication is that different translations — often more than one — apply to these concepts in different languages. Listening to all the delegates involved in this discussion made me realise how important that is, and that it is not something that can be easily dealt with. Furthermore, some countries have different political agendas — this becomes clear when you listen to the statements they make.
Amazingly, people do eventually come together on most of the issues.
Political posturing and diplomacy
In this context I also appreciated the tough stand that countries such as the USA, Canada, Australia and the European Union took on some of the issues. In the end that worked to clarify matters somewhat, as their strong language made everybody focus on the right issues.
And at the end of the week the US delegation officially declared 'so far so good'. This came as something of a shock to some of the media, who now have to back-pedal on their doom and gloom reporting.
How to move beyond WCIT-12
Are we there yet? No, of course not. There is one more week to go. But I am prepared to stick my neck out and say that there will most likely be a good outcome.
The ITRs, as they have stood since we developed them and put them in place in Melbourne in 1988, enabled the world to create the internet. They will stay, albeit fine-tuned a bit; and as proven tools they can, and should, be used by non-connected or under-connected communities and countries to their own benefit.
As argued in the UN Broadband Commission and in its discussions with various governments around the world, it is now up to these countries to develop policies and strategies aimed at capturing the social and economic benefits of the successful telecoms infrastructure from which the whole world is benefiting.
Thanks to organisations like the ITU, who have been working on this since 1865, we can make a telephone call to anybody in the world and access the internet from wherever we are. Compare that with the level of standardisation in the IT world, or in any other sector for that matter. There will be few people who would claim that this is not a great thing, so let us make sure we continue along that road.
Once WCIT-12 is brought to a good conclusion we can start looking at how we can assist the under-connected to become part of the global digital economy, so they can start reaping their own social and economic benefits and thus fund the investments for the infrastructure they need.
Governments should perhaps channel some of their often extensive USO funds, and the income from lucrative spectrum auctions, into national broadband infrastructure, and work together with their industry to develop their own national broadband plans.
The multi-stakeholder platform established at WCIT-12 should make it a priority to assist countries in building up the capacity and human skills to make this happen.
Follow-up ITU meetings and conferences over the next few years can take these issues further and help in developing government and industry policies and strategies, as well as business models that will allow these countries to create their own incomes, rather than depending on handouts coming from the ageing international accounting rate structures. Obviously these old structures cannot be demolished overnight; a transitional period is needed and, again, this is where the rest of the world can assist.
By Paul Budde, Managing Director of Paul Budde Communication. Paul is also a contributor to the Paul Budde Communication blog.

Thursday, December 6, 2012

Hackers, Crackers & Script Kiddies

Real hackers are few, highly skilled, and probably not interested in you, but the fast-growing "script kiddie" hobby ensures your systems will be attacked.



As "point and click" interfaces have "dumbed down" computer users and administrators, the same has happened in hacker land, to the point the "dumbed down" are not allowed the title, they are called "script kiddies". Script kiddies download automated hacking tools from Internet sites and launch them against random blocks of IP addresses - looking for unprotected computers to play with. Then they use similar low skill tools to do whatever they please with your computer. Problem is, there are many thousands of script kiddies, and more every day.
RULE: Security through Obscurity no longer works - at all. Not against an intense, but totally random attack. If you are vulnerable, someone already knows, perhaps many people, they just haven't had time for you yet.
All this is not to say real hackers aren't still hard at work. Someone creates those easy to use tools, after all.
Hackers vs. Crackers - In the nerdy culture, a "hacker" is a highly skilled computer geek who does "great hacks" (generally clever computer code). You will be told in no uncertain terms that the people who break into systems are called "crackers", and that "hackers" are honest folks who would never, never do such a thing (even though they have no respect whatsoever for authority, business, government, laws, property, "suits", grooming, personal hygiene, mom or apple pie). OK, some wouldn't. Of course, every "cracker" refers to himself as a "hacker". Common usage lumps the whole lot under the term "hacker".

Dell's Ubuntu XPS 13 should worry Microsoft


TIN BOX FLOGGER Dell's decision to put arguably its best laptop on sale preloaded with Ubuntu Linux shows not only how far desktop Linux has come but how far Microsoft has fallen.
Dell announced its Project Sputnik earlier this year to a warm if not ecstatic reception. The firm had preloaded Linux onto its consumer machines before but they were hard to find and on forgettable machines. However with the XPS 13 the firm is not only loading Linux on its most high profile laptop but showing that Microsoft's operating system isn't the only choice in town for OEMs and consumers alike.
From a Linux community perspective, Dell's XPS 13 comes with Intel's ultrabook branding, which might mean little to those who actually read and understand specifications but means a lot to the customer in the street who is bombarded with Intel's ultrabook marketing message. Dell might be pitching its Ubuntu XPS 13 laptop as a developer's machine rather than one for Facebook and YouTube users, but that isn't a bad idea in the long run either.
Dell's decision to price the Ubuntu XPS 13 $250 more than its Windows counterpart will no doubt generate debate, with some asking where the so-called Microsoft tax has gone.
However some of Dell's price hike can be explained by the 256GB SSD in the Ubuntu XPS 13, double that of the Windows machine. The rest of the premium can be put down to the lack of bloatware that Dell and other OEMs are paid to clog new machines with.
The perception that Dell simply downloaded an Ubuntu ISO from Canonical's website and loaded it, as opposed to paying Microsoft some cash for a DVD, simply isn't accurate on several levels. Enterprise Linux vendors such as Red Hat and Suse do not win business because the operating system has a lower sticker price, but rather because of lower maintenance costs, better reliability and, in the case of Linux, perhaps the ability to replicate the functionality of Microsoft's operating system.
From Dell's point of view, it has to support Ubuntu much in the same way that it supports Microsoft's Windows. Not only does the firm have to make sure that all of the features on its XPS 13 work with Ubuntu but it must ensure that its support staff can deal with users, whether they be technology literate developers or not, running Ubuntu after years of spoon-fed Windows support.
The reason why Dell, HP, Acer and others can offer cheap hardware is because it is subsidised by the bloatware peddlers. For those that value their time, Dell's Ubuntu XPS 13 won't require the ritual that follows the purchase of almost every store-bought Windows machine, that is, to wipe the hard drive and reinstall a bloatware-free version of the operating system.
So Dell's decision to price its Ubuntu XPS 13 a bit higher than the Windows version might not be nice but it can be justified and one hopes that in time price parity will be achieved. However the firm's real problem is that the XPS 13 is starting to show its age, especially given that it is still Dell's showcase laptop no matter what operating system it is running.
Dell could do with upgrading the screen on the XPS 13, which it markets as having just HD 720p resolution, or more accurately 1366x768 resolution. Aside from the boost in storage, the firm has done little to evolve what is otherwise a surprisingly premium design.
Specification issues aside, Dell's decision to market the Ubuntu XPS 13 to developers is a very smart move and one that could pay off in the long term. The firm's literature on the machine says it bundles a number of developer tools that should, in theory at least, help developers get the plumbing of their development environments sorted quickly. Some developers will scoff at Dell's bundle but it shows that the firm is focused on what the Ubuntu XPS 13 will be used for and it certainly beats having a load of toolbars and a trialware antivirus constantly thrashing the hard drive.
Although the Linux community has never been short on developers, having more developers build applications and services on Linux will hurt Microsoft and its long-standing strategy of pushing its own programming languages and frameworks, such as .NET, C#, ActiveX and JScript. And as Dell and its rivals look to push Linux onto machines that use consumer hardware, not Xeon processors and SAS hard drive controllers, it will force hardware vendors such as Intel to provide equal levels of Linux kernel support on both consumer and enterprise hardware.
What will really hurt Microsoft in the long run is if Dell or its rival OEMs and third party developers work on building applications that make use of their own services, such as Dell Cloud or music stores. Effectively this will mean that not only will Dell avoid the Microsoft tax but it will have a high quality operating system that isn't bogged down with Windows and bloatware that doesn't even promote its own services, which can offset the loss in revenue from not preloading gigabytes of useless software.
Dell's XPS 13 might not be the best laptop on the market right now, especially alongside Lenovo's X1 Carbon or Apple's MacBook Air, but that the firm is willing to preload Linux on its showcase product just weeks after the launch of Windows 8 must be worrying for Microsoft. Dell has said that it is looking at expanding availability of the Ubuntu XPS 13 outside the US, and while it is not expected to be a sales hit thanks to it being pitched at developers, it does put a stake in the ground for both Dell and Microsoft. µ

The Inquirer (http://s.tt/1vH6I)

Wednesday, December 5, 2012

How to spoof a MAC address


MAC address filtering for wireless networking isn’t real “security”. Anyone who pays any attention to current trends in wireless security at all should know that MAC filtering is less effective than WEP — and that WEP can be cracked almost instantly these days with commonly available tools.
This doesn’t mean MAC filtering is useless. Its resource consumption is almost unmeasurable, and even if it doesn’t keep out any reasonably knowledgeable security crackers willing to spend a few moments gaining access, it does keep out a lot of automated opportunistic attacks that are aiming solely for the absolute lowest-hanging fruit on the security tree. Since that lowest-hanging fruit consists of the majority of wireless access points, MAC filtering can be of value as a way of turning away the majority of opportunistic attackers.
Don’t rely on MAC filtering alone, however. Please, just don’t. It’s a bad idea. People seem to think “Oh, well, sure a determined attacker can get past it, but not anyone else.” It doesn’t take much determination at all to spoof a MAC address. In fact, I’ll tell you how:
  1. “Listen” in on network traffic. Pick out the MAC address. This can be done with a plethora of freely available security tools, including Nmap.
  2. Change your MAC address.
You can spoof a MAC address when using Nmap with nothing more than a --spoof-mac command-line option for Nmap itself, to hide the true source of Nmap probes. If you give it a MAC address argument of "0", it will even generate a random MAC address for you.
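As a rough illustration, this is what the option looks like on the command line. This is only a sketch: the target range 192.168.1.0/24 is a placeholder, and the spoofed MAC only matters for raw-packet scans (such as -sS) sent on a local Ethernet segment.

  sudo nmap -sS --spoof-mac 0 192.168.1.0/24       # "0" makes Nmap generate a fully random MAC for its probes
  sudo nmap -sS --spoof-mac Apple 192.168.1.0/24   # a vendor name or prefix yields a random MAC from that vendor's range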
For more general MAC address spoofing, your MAC address is trivially reset with tools available in default installs of most operating systems. Here are some examples:
  • Linux: ifconfig eth0 hw ether 02:a0:04:d3:00:11 (see the sketch after this list for the modern ip link equivalent)
  • FreeBSD: ifconfig bge0 link 02:a0:04:d3:00:11
  • MS Windows: On Microsoft Windows systems, the MAC address is stored in a registry key. The location of that key varies from one MS Windows version to the next, but find that and you can just edit it yourself. There are, of course, numerous free utilities you can download to make this change for you as well (such as Macshift for MS Windows XP).
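On newer Linux systems the ifconfig command shown above is gradually being replaced by the iproute2 ip tool. Here is a minimal sketch of the same change; the interface name eth0 is an assumption, and the address uses the locally administered 02 prefix so the kernel will accept it:

  sudo ip link set dev eth0 down                        # the interface must be down before the address can be changed
  sudo ip link set dev eth0 address 02:a0:04:d3:00:11   # set the spoofed MAC address
  sudo ip link set dev eth0 up                          # bring the interface back up
  ip link show eth0                                     # verify the new hardware address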
All of these techniques can of course be automated by self-propagating malware, and the creation of the malware can even be automated to some extent by existing malware creation “kits”. If that doesn’t convince you that MAC filtering does not provide real security, I don’t know what will.

How the ITU is leading the way to the 20th century

You are likely already aware of the World Conference on International Telecommunications (WCIT) which opened in Dubai on Monday. This two-week conference is where a review of the International Telecommunication Regulations established by a 1988 treaty is being conducted by representatives of the 178 International Telecommunication Union (ITU) members who are party to it. The ITU, originally formed as an industry association for telegraph operators in the 1800s, has expanded over the years to become a United Nations agency with a membership consisting of nearly 200 countries and more than 700 private organizations. Although only states have votes on the adoption of ITU policy and rules, all members may propose changes.
There have been numerous accusations about secret agendas behind the most significant changes proposed to the ITU-T rules which govern wireline communications across the legacy PSTN (Public Switched Telephone Network). Despite nearly all such arguments being charged with political rhetoric and grandstanding, most of them are sadly very accurate. Rather than trying to summarize them all here I'm going to highlight the worst of the worst and provide links to more detailed information on each. We can start with this, though. The UN is not trying to take over the Internet. That's not to say that various ITU members are not trying to exert improper regulatory control over it for equally improper reasons. But despite technically being an agency of the UN, the ITU isn't really under its control. In fact, the real controlling authority in this case is the 1988 treaty mentioned previously.

The ITU's role in the Internet
ITU Secretary General Hamadoun Toure has claimed that regulation of Internet communication is not an expansion of the agency's authority because its mandate, as set out in its own constitution, covers all telecommunication. That's nonsense. The ITU's constitution does, in fact, cover telecommunications, but in that context it refers to nothing more than interoperability between international, government-regulated PSTN networks.

In reality there are basically two goals behind the problematic proposals to expand ITU authority. The first is an attempt by legacy telecom players, including governments with state run telco monopolies, to neutralize market forces to pad their profits. At the same time governments who seek to restrain the flow of information and ideas want to gut the Internet's ability to empower their citizens.

In an opinion piece for Wired last month, Toure detailed what the ITU's members claim to be aiming for, but even a cursory look at the actual proposals paints a very different picture, which mostly boils down to two issues.

Content Filtering
Toure's first point of emphasis was on cybersecurity:
Many authorities around the world already intervene in communications for various reasons – such as preventing the circulation of pornography or extremist propaganda. So a balance must be found between protecting people's privacy and their right to communicate; and between protecting individuals, institutions, and whole economies from criminal activities.

The problem is that while many proposals use the word security in their description, it's really just a smokescreen for an obvious agenda of filtering and censorship. They come, not surprisingly, from member states like China and Russia, whose attempts to control the free flow of information and communication both through and within their borders are well known. Others originated in regions like the Middle East, where social networking has been instrumental in toppling regimes.

Take, for example, Egypt's contribution to the 'security' question (via the Center for Democracy & Technology):
There must be transparency of the routes: on request, Member States must be able to know the routes used, in particular to avoid fraud and to maintain national security. If the [Member State] does [not] have the right to know or select the route in certain circumstances (e.g. for Security reasons), then the only alternative left is to block traffic from such destinations, which is neither logical nor desirable!

That's not security. It's censorship.
Subsidizing telcos
Toure also claimed ITU members were focusing on expanding Internet service to reach more of the world:
The conference will also focus on how ICTs – and particularly broadband – can be highly effective catalysts for sustainable social and economic progress.
Right now, access to this potential is constrained by issues of affordability, with high costs a reality for many users. Related to this is insufficient investment in infrastructure, especially in developing countries.

In reality the big push for revenue is coming from ETNO (the European Telecommunications Network Operators Association), representing the highly profitable European telecom industry. It's nothing more than a demand that others pay for future network upgrades (download full document from WCITLeaks.org):

Operating Agencies shall endeavour to provide sufficient telecommunications facilities to meet requirements of and demand for international telecommunication services. For this purpose, and to ensure an adequate return on investment in high bandwidth infrastructures, operating agencies shall negotiate commercial agreements to achieve a sustainable system of fair compensation for telecommunications services and, where appropriate, respecting the principle of sending party network pays.

In plain English this is nothing more than a rehash of the tired argument ISPs have been making for decades about how companies like Google and Netflix are getting a free ride on their networks. It's just as nonsensical now as it has always been. Without the billions of dollars spent annually developing, deploying, and maintaining search engines, media delivery, cloud storage, and numerous other services, ISPs would have no customers for their broadband offerings to begin with. What ETNO is proposing is essentially to shoehorn Internet traffic into a 20th century PSTN model where every hop a packet takes across any network can be metered, measured, and billed to the originating network in order to pad telco profits. If Deutsche Telekom needs to upgrade its infrastructure, maybe it should invest some of the 480 million euros in profit it declared for the first three months of this year.

And we need the ITU why?
The real question here isn't whether these proposals will be adopted. They won't. In fact it's entirely possible not a single modification to existing rules or recommendations will come out of the WCIT conference because it would require unanimous support from all countries.

What you should be asking is whether it's time to put the ITU out to pasture. They don't necessarily need to disappear entirely. They could simply return to their roots as a standards body for infrastructure providers. Given their almost non-existent role in building the Internet as we know it today, though, it's hard to see any reason to give them control over its future.

Tuesday, December 4, 2012

Windows vs Linux vs Mac Smackdown : An Objective Comparison

Comparisons between the Microsoft Windows, Apple Mac and Linux computer operating systems have been a long-running topic since the beginning of time. Comparisons of these operating systems tend to reflect their origins, historic user bases and distribution models.
We will start with a background comparison of the three operating systems.
Windows
Windows, developed by Microsoft, is the most widely known operating system. About 9 out of 10 homes and businesses currently use at least one Windows computer. Windows was originally based on MS-DOS; that line became known as the 9x series. All subsequent Windows versions have been based on Windows NT, and the most recent NT-based release is Windows 7.
Mac OS X
OS X is an operating system developed by Apple and is currently the second most used desktop OS after Windows, with less than 20% market share. OS X, unlike Windows, is based on Unix, so it is considered part of the Unix family of operating systems, as Linux is (though it is a BSD-derived Unix rather than a Linux distribution).
Linux
Linux is not actually a single OS, but rather a family of distributions built around the Linux kernel, a Unix-like system. Linux is very popular for servers and has recently found its way onto the desktop. It's not as popular as OS X or Windows yet, but its popularity is rising. Unlike OS X or Windows, Linux is free and open source. There are many distributions of Linux, such as Ubuntu, OpenSUSE, etc.

The Myths & Facts About These Platforms
  • Viruses
It is generally stated that PCs commonly get viruses but Linux and Macs do not. That is untrue, because OS X is just as vulnerable to viruses as Windows is. The reason Windows appears to be more vulnerable is that it simply has more malware written for it, not that the system itself is inherently weaker. Mac OS X seems to have few viruses targeted at it because attackers see little incentive: far fewer people own Macs than Windows machines, so there is not much payoff in writing a virus for a Mac. Mac viruses do exist and can affect a Mac system just as much as a Windows virus can affect a Windows system. There is no hack-proof or virus-proof system.
Even Linux systems have a few viruses too.
Truth be told, hardening techniques like the ones discussed here make the Windows platform less susceptible to viruses than is commonly perceived.
  • Stability
Many people say that OS X never crashes and is the most stable OS ever. The same can now be said about Windows 7.
Truth is, Windows 7 is the most stable Windows operating system ever. OS X can crash just as frequently as a Windows OS. In fact, OS X crashes even more when you are running non-Apple-approved software such as Adobe Flash or Audacity. Even Steve Jobs admitted that Macs can crash a lot, despite what his "I'm a Mac" ads have said.
The famous BSOD (Blue Screen of Death) on Windows is a misconception from older Windows 9x systems. Back during Windows 9x series, stability was actually an issue. However, the switch to Windows NT systems made the OS much more stable and Blue Screens are considered quite rare now. Apple and other Apple fanboys are using the Windows 9x history as an argument against Windows even though those systems have long been discontinued and those problems no longer affect modern Windows systems.
As for Linux, it can crash too, although that is much rarer than on Windows or OS X. When Linux crashes, it's called a kernel panic.
  • Hardware
Normally in a Windows vs Linux comparison, hardware would not be given much mention, but because Mac OS X locks users to Apple hardware, this comparison is necessary.
For Windows and Linux, you can choose what you want to install your OS on. There are tons of options from manufacturers like Dell, HP, Acer, Gateway, Lenovo, Asus, and so on. For Mac OS X, you only have Apple.
Firstly, for the same hardware specifications, a Windows PC usually costs much less than an Apple Mac. Probably the deal breaker for most is that carrying out a hardware upgrade on a Mac is near impossible, except for the real geeks. And that is if you do not mind the fact that opening up a Mac will void your warranty and violate Apple's EULA.
  • Software Library
Windows has a larger software library than any other OS. This means that the majority of programs, applications, and games out there are meant for Windows. Productivity suites like Microsoft Office are always available on Windows first, with the same version released on OS X later on. Many other programs are Windows exclusives. If you're into gaming, you'll need good hardware (see above), and to play more games you'll need Windows. Most PC games today are meant for PCs running Windows; there is a line of games called Games for Windows, which is obviously meant for Windows, and many Steam games are also Windows-only. Only a few games will work on Macs or Linux.
  • Usage
There's a common myth that Mac OS X is better and more common for video editing. This is untrue. Almost all video editing programs are multi-platform, meaning that they work on both Windows and OS X; Linux may be a less favoured exception.
Programs like Sony Vegas, Adobe Premiere, Avid, and so on work across both OSes. The only notable video editing program that is OS X specific is Final Cut Pro. As for application and software development, including game development, Windows is definitely the main platform. Many programs today are written in programming languages such as C++, C#, Java, and Visual Basic. While some of those languages work on OS X and Linux, the newer and more common ones, such as Visual Basic and C#, are primarily aimed at Windows. Game development for consoles and PCs is done primarily on a Windows platform for the same reason.
However, it should be mentioned here that software called Wine lets you run Windows software on other operating systems like Linux. With Wine, you can install and run these applications just as you would in Windows.
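As a rough illustration of how simple this can be, here is a minimal sketch assuming a Debian or Ubuntu style system; the installer name setup.exe and the installed path below are hypothetical placeholders:

  sudo apt-get install wine                      # install Wine from the distribution's repositories
  wine setup.exe                                 # run a Windows installer under Wine
  wine 'C:\Program Files\SomeApp\someapp.exe'    # launch the installed program from Wine's virtual C: drive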
As for server use, Linux accounts for over 60% of server systems. However, Windows is also common for server usage. It depends on preference and the amount of resources someone has.
OS X also has a server edition, however it is not very popular and is rarely used mainly because of cost issues. Apple software and hardware tend to be very expensive and overpriced.
So which is the best OS? Well, that really depends on what you want. Windows is generally recommended for most users who do not have specific needs. I use Linux as my main OS but switch to Windows when I need software that is not supported on Linux.


linux my new love

Guys, I'm so sorry that I did not post anything in the last few days, but I have been crazy busy working with my new Linux OSes, and I think I'm in love. Classically I have always been a Windows brat; I have only ever used Windows since my first Win 98 PC way back when. But I just cannot believe how much easier it is to do things like MITM attacks, WEP/WPA/WPA2-PSK cracking and hotspot hacking for free access. It's just insane how easy this is on Linux compared to Windows. I simply cannot believe the amount of control Linux gives compared to Windows.

So far I have only used Ubuntu 12.10 and BackTrack 5 R3, and honestly my favorite for everyday tasks is Ubuntu, but my favorite for hacking and cracking is BackTrack, for its sheer number of pre-installed tools. Please drop a comment and let us know what your favorite OS is and why in the comments below.

Voyager discovers 'magnetic highway'

NASA's Voyager 1 spacecraft has encountered a "magnetic highway" at the edge of the solar system, a surprising discovery 35 years after its launch, the experts behind the pioneering craft said Monday.
Earlier this year a surge in a key indicator fueled hopes that the craft was nearing the so-called heliopause, which marks the boundary between our solar system and outer space.
But instead of slipping away from the bubble of charged particles the Sun blows around itself, Voyager encountered something completely unexpected.
The craft's daily radio reports sent back evidence that the Sun's magnetic field lines were connected to interstellar magnetic fields. Lower-energy charged particles were zooming out and higher-energy particles from outside were streaming in.
They called it a magnetic highway because charged particles outside this region bounced around in all directions, as if trapped on local roads inside the bubble, or heliosphere.
"Although Voyager 1 still is inside the Sun's environment, we now can taste what it's like on the outside because the particles are zipping in and out on this magnetic highway," said Edward Stone, a Voyager project scientist based at the California Institute of Technology, Pasadena.
"We believe this is the last leg of our journey to interstellar space. Our best guess is it's likely just a few months to a couple years away. The new region isn't what we expected, but we've come to expect the unexpected from Voyager."
Voyager is now 11 billion miles (18 billion kilometers) away from the Sun, which is 122 times the distance from the Earth to the Sun. Even at that distance, it takes only about 17 hours for its radio signal to reach us.
Scientists began to think it was reaching the edge of our solar system two years ago when the solar winds died down and particles settled in space the way they would in a swamp.
An increase in the number of cosmic rays in May also led them to believe Voyager had approached interstellar space.
In July the reading changed again, and by August 25 Voyager was on the magnetic highway. The number of particles from the outside jumped sharply and the number of particles from the inside fell by a factor of 1,000.
"It is as if someone opened the floodgates and they were all moved down the river, also some boaters powered up stream with close to the speed of light have been able to get in at last," said Stamatios Krimigis, Voyager's principal investigator of low-energy charged particles.
While the magnetic field is exciting, Krimigis sounded somewhat disappointed that Voyager had not yet escaped the solar system.
"Nature is very imaginative and Lucy pulled up the football again," he said, making reference to the classic comic strip Peanuts in a conference call with reporters.
The twin Voyager craft -- Voyager 2 was actually launched first, on August 20, 1977, followed by Voyager 1 on September 5 -- were designed primarily to study the biggest planets in our solar system, Jupiter and Saturn.
Taking advantage of a planetary alignment, they fulfilled that mission before pushing on to Uranus and Neptune, beaming back stunning images of the first two in 1979 and 1980, and the latter pair in 1986 and 1989.
But with those jobs complete and both craft still functioning perfectly, project managers decided to keep mining information as the devices fly further into the void.
NASA has described Voyager 1 and its companion Voyager 2 as "the two most distant active representatives of humanity and its desire to explore."
The scientists controlling Voyager 1 -- whose 1970s technology gives it just a 100,000th of the computer memory of an eight-gigabyte iPod Nano -- decided to turn off its cameras after it passed Neptune in 1989 to preserve power.
Assuming the craft continues to function normally, they will have to start turning off other on-board instruments from 2020, and it is expected to run out of power completely in 2025.

Saturday, December 1, 2012

BackTrack 5 R3 Released!

The time has come to refresh our security tool arsenal – BackTrack 5 R3 has been released. R3 focuses on bug fixes as well as the addition of over 60 new tools – several of which were released at BlackHat and Defcon 2012. A whole new tool category was populated – "Physical Exploitation", which now includes tools such as the Arduino IDE and libraries, as well as the Kautilya Teensy payload collection.
Building, testing and releasing a new BackTrack revision is never an easy task. Keeping up-to-date with all the latest tools, while balancing their requirements of dependencies, is akin to a magic show juggling act. Thankfully, active members of our redmine community such as backtracklover and JudasIscariot make our task that much easier by actively reporting bugs and suggesting new tools on a regular basis. Hats off to the both of you.

We would like to thank Offensive Security for providing the BackTrack dev team with the funding and resources to make all of this happen. Also, a very special thanks to dookie, our lead developer – for building, testing and packaging most of the new tools in this release.
Together with our usual KDE and GNOME 32/64-bit ISOs, we have released a single VMware image (GNOME, 32-bit). For those requiring other VM flavors of BackTrack, building your own VMware image is easy – instructions can be found in the BackTrack Wiki.
Lastly, if you’re looking for intensive, real world, hands on Penetration Testing Training – make sure to drop by Offensive Security Training, and learn the meaning of “TRY HARDER“.
For the insanely impatient, you can download the BackTrack 5 R3 release via torrent right now. Direct ISO downloads will be available once all our HTTP mirrors have synched, which should take a couple more hours. Once this happens, we will update our BackTrack Download page with all links.

BackTrack Live USB Install

This is the simplest method of getting a live install onto a USB drive, using UNetbootin. Note that we will format the USB drive and erase its contents.
  1. Plug in your USB Drive (Minimum USB Drive capacity 2 GB)
  2. Format the USB drive to FAT32
  3. Download Unetbootin from http://unetbootin.sourceforge.net/
  4. Start Unetbootin and select diskimage (use the backtrack-final ISO)
  5. Select your USB drive and click "OK" to create a bootable BackTrack USB drive
  6. Log into BackTrack with the default username and password root / toor.
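Once you are logged in at the console, a few follow-up commands are usually the first things run in a fresh BackTrack session. This is a hedged sketch rather than part of the official steps, and eth0 is assumed to be the wired interface:

  passwd          # change the well-known default password (root / toor)
  dhclient eth0   # request an IP address if networking did not come up automatically
  startx          # start the graphical desktop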