
Troubleshooting system stability

There are two tools I find invaluable when troubleshooting system stability. The first is sigverif.exe, which checks for digital signatures on system files. The second is verifier.exe, which can be used to test the stability of drivers.

I only ever use Windows Hardware Quality Labs (WHQL) signed drivers on my Windows servers, and I'd recommend you make this organisational policy for all servers (it can be enforced through Group Policy). Your servers are a vital resource and you just cannot afford to have untested third-party code running in kernel mode - this policy will be mandatory on future versions of Windows for the same reason.

If you do have unsigned drivers, or you suspect a signed driver is causing you problems, then you can use verifier.exe to monitor that driver. It will allow you to pinpoint the driver causing stability problems, but at the cost of system performance whilst the tool is in use.
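
For the record, this is roughly how I drive verifier from the command line (a sketch only - suspect.sys is a placeholder for the driver you want to watch, and it's worth running verifier /? to confirm the switches on your build):

verifier /standard /driver suspect.sys   (enable the standard checks against a single driver, then reboot)
verifier /query                          (see what is currently being verified)
verifier /reset                          (clear the settings when you're finished)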

If I'm asked why a server is crashing, the first thing I do is run sigverif and find unsigned drivers. Then I either find suitable updates that are WHQL certified, or I tell the server owner to find replacement hardware that has WHQL drivers and send the old stuff back.

If neither is possible (like when I was stuck on a sailing ship in the middle of the sea with a laptop-driven GPS system that kept crashing), I turn to verifier.exe and start monitoring the drivers on the system. This helps me identify exactly which driver is causing the problem.

I also check the BIOS version and see if there are updates available - you'll be amazed at some of the things that get fixed in the subsequent releases of a BIOS.

The final step is to start checking the hardware. Windows Vista includes a new Memory Diagnostic tool that will help with memory troubleshooting. Other hardware will still require special software like PC Check, but if it's not the software and not the memory, maybe it's time to think about sending it back for a replacement?

It's not always possible to know exactly what's causing a system crash; the best you can do is try to reduce the possibilities by following best practice.

Troubleshooting Stop messages: general strategies

Posted by wigunara | 0 Comments

High Definition format wars could expedite transition to electronic distribution

When the video industry decided to move away from VHS there was only ever one real standard - DVD (and no, Laserdisc doesn't count!). Now the industry buzzword is High Definition, and DVDs no longer cut it. So once again the industry is looking to move to a new standard; the problem is, there are two. High Definition-DVD (HD-DVD) and Blu-Ray are both endorsed by different manufacturers, and it looks like both standards are going to be fighting for market share in 2007.

The war has already begun: HD-DVD players are on the market, and Blu-Ray is promising to go mainstream on the sales momentum of the PlayStation 3.

The Xbox 360, meanwhile, will feature a cheap HD-DVD add-on for consumers who want high-definition content but don't want to pay upwards of £500 for a player.

However, whilst the industry cannot agree on a physical medium, it does agree that the future is electronic distribution.

I think that unless a standard emerges victorious (round one to HD-DVD for being first to market and cheaper than Blu-Ray), the "format wars" will simply speed up the adoption of electronic media distribution through online video services.

Which is great news for a technology company like us :)

Start investing in companies that have a strong IP portfolio and R&D investment in electronic video distribution technologies! (*cough* Microsoft *cough*)   :)

Posted by wigunara | 0 Comments

Innovation will suffer unless we take a step back

So what really Grinds My Gears...

...Solutions to problems created by solutions to problems.

I think the IT industry is particularly bad for that. That's why I don't particularly rate anti-virus, and I'm also not keen on the trend for more and more anti-X technology. We are continuing a dangerous exercise of creating markets for solutions to problems created by other markets we created to solve other problems, so the new markets have no real value-add. For an industry, this is bizarre, and it's costing our customers money.

We created an operating system with TCP/IP, then a whole generation of applications grew up in the pre-internet days that were badly designed and could be exploited remotely. So we created firewalls to block TCP/IP ports, and also to work around the fact that TCP/IP was never designed to be used as much as it has been, with technologies like NAT. So we invented the firewall, which solves a problem in another technology, but then we need to manage this firewall, so suddenly there was a market for centralised firewall management and reporting. But hang on - because of the design of the operating system and application ecosystem we have no controls in place to prevent malicious code from running, firewall or not, so we should write an application that keeps a database of malicious code and prevents it from running (and hey presto, anti-virus was born). NAT is another technology that really shouldn't exist; it solves the problem of limited address space in TCP/IP and has no inherent value in itself - in fact it reduces the services available to organisations that use it.

Then the information worker generation came about, and suddenly people had computers at home, and so they wanted to be able to work from home.

We created VPNs, which are a sledgehammer approach to a problem. Sure, they allow two networks to be interconnected, but why do the networks need to be interconnected? What services require interconnection? What exactly do the services need to do that requires interconnection? And why can't this be accomplished by re-architecting the services to achieve the desired result directly? It turns out VPN was one sledgehammer too far, and a market was suddenly born for VPN-less remote connectivity solutions, typically wrapped up in HTTP or HTTPS because all firewalls allow port 80/HTTP, so it's easier to adopt (without being any safer than if it used any other port - but that's another issue entirely). So now we have RPC-over-HTTP. That's great, but why don't we just have a new RPC mechanism that can natively traverse the internet? Because if it's internet-ready it will have no problem working in a LAN environment.

People will rightly argue that we can't just break everything, ruin our customers' investments and require them to retrain and redevelop thousands of systems, but haven't we been down this route so many times before?

Someone has to have the proverbial to stand up and say "Betamax is better than VHS". Look what happened with RISC vs CISC: the Intel x86 architecture was born out of a '70s chip for calculators, and it has grown (been dragged, kicking and screaming) up and up - 16-bit, then 32-bit, then 32-bit protected mode, now 64-bit - not to mention all the additional extensions that have been bolted on to compensate for the terrible x87 floating point architecture: MMX, SSE, 3DNow!, SSE2, 3DNow! Professional, SSE3, SSE4. As an industry, if we had changed architectures much earlier on, we'd have far more powerful and energy-efficient computers today.

Then again, look at "blue sky" projects like Intel Itanium, which was designed to be the uber-architecture of the future. It turns out it was too ambitious, too much design-by-committee, and will now probably not be realised for a long time (if ever) in any significant way in the marketplace. Passport is another good example: it was the first step in solving one of the fundamental problems of our industry - identity management - which could have been a powerful first step toward solving spam and breaking down security boundaries. Security boundaries that require VPNs and private LANs. Unified storage (WinFS anyone?) is another good example.

It seems to me some good ideas are few and far between, and too many are falling by the wayside as people produce similar technology that appears to do the same thing on the surface but is really just a modern wrapper for ancient technology.

As an industry, we are good at listening to our customers but not always so great at communicating with them. A typical argument against change is "protecting customers' investment" or "breaking application compatibility". Sure, those things may happen, but perhaps we should have a broader conversation with the customer about what the changes will mean for the future of computing.

Are we driven to short-sighted solutions by our customers' own short-term objectives? After all, the shelf life of an IT manager or CIO is getting shorter and shorter; they only need to deliver short-term results to get their next promotion. Do we need to take a small step back to help not only our customers of today, but our customers of tomorrow and the industry as a whole?

Also, does a bigger trend toward open standards and interoperability compromise our ability to drive innovation? Do we risk standardising prematurely on a batch of technologies that is a set of compromises, extensions and fixes to fundamentally broken technology? Will this stifle innovation?

Technologies that have found themselves out of their depth are being wrapped up in newer ones, and each time, the stack of cards we are building is getting more and more complex, unwieldy and expensive to manage.

Customers are not challenging us enough; they take it for granted that if they deploy an operating system, they'll need a firewall, anti-virus and anti-malware. If they deploy a network they'll need NAT; if they use e-mail they'll need anti-spam. As an industry we should be working together on architectures that solve these problems from the ground up.

OK, so I'm being a bit pessimistic, I admit. As an industry we do have long-term solutions to problems. Using the TCP/IP example, we have IPv6, which can rid us of NAT (but will it ever be adopted?). We have web services, which in theory can supersede RPC/RMI technologies and operate across the internet and LANs. Private LANs are an interesting concept in themselves; they are another quick fix to a bigger problem - management, performance and security segmentation.

Somewhere along the line we've lost sight of the big picture. We must redirect our efforts from quick fixes to long-term solutions, and be brave enough to challenge ourselves to fix the root of the problem rather than add another layer of cards to an already shaky house.

That house of cards is the IT industry - the only question is, when will it fall?


Posted by wigunara | 0 Comments

Security: Recent exploits hint at a shift in focus?

It seems to me that the malware community behaves like electricity - it follows the path of least resistance.

With the huge drive around securing Windows and Internet Explorer delivering results, are virus writers simply going for lower-hanging fruit? Applications were traditionally not as tempting a target as an operating system for several reasons - ubiquity and access to the system being the two main ones that I can see.

As we reduce the attack surface of software like Windows, SQL and Exchange we may be forcing the hand of the dark side of the force to find opportunity elsewhere.

With recent attacks focusing on members of the Office family (e.g. Excel), has the malware community shifted its focus higher up the software stack? Can we expect many more data-document delivered payloads?

Time will tell...

 

Posted by wigunara | 1 Comments

Valve try out a new episode-based business model with Half-Life 2

Half-Life 2 Episode One

Yesterday Valve released the first in a trilogy of episodes for Half-Life 2. Episode One will cost $20 and be available on Valve's Steam platform and in shops. The trilogy will conclude by Christmas this year.

This, while exciting to Half-Life 2 fans like myself, is also a foray into a new business model for game developers and publishers: episodic content revenue.

So how do episodes differ from expansion packs?

I think the analogy is movies and sequels (games and expansion packs), although you could argue Half-Life 2 was the sequel to Half-Life, so perhaps expansion packs are more like straight-to-TV movie sequels (and often just as bad), whereas episodes are more like a TV series.

I'm very excited about the draw that episodic content will have, as story lines can be developed and changed by the game developer in response to community feedback, creating a closer relationship with the audience.

Fans will pay for the episodes as long as they continue to be compelling. A lot of the same rules that apply to a TV series will apply to the game episodes. In much the same way TV programs like Lost engage the audience by ending on a cliffhanger, computer game episodes can leave the gamer wondering how they'll handle the next challenge. And with games like Half-Life 2 that feature characters that the player forms relationships with as they go, Valve can also play with character development. All this will help create demand for the next episode.

If other game developers follow this model, then we may eventually see episodes for different games competing against each other for your hard-earned green.

What's not clear at this stage is whether you can skip an episode or you have to buy all of the previous episodes in order to play the next.

Valve's Steam platform is a competitive advantage for them, as it allows them to sell directly to their audience online, cutting out the publisher. It's also valuable to the customer, who can pre-order a game on Steam. Steam downloads the game in encrypted form before the release date, and the moment the game is officially released the decryption keys are made available through Steam, allowing the end user to play within minutes of the game's release.

The pricing of episodic content, at $20 per episode, is quite high when you consider the price of a traditional "monolithic" game is $30-$40. But I don't think it's unreasonably high: one episode will provide 4-6 hours of entertainment, so it's better value than the cinema or a DVD if you're talking in terms of hours of entertainment per dollar. Although it's not as social.

Blizzard will continue to dominate the massively multiplayer online market with World of Warcraft and their monthly subscription model, so it looks like Valve will pioneer the single-player model.

The nice thing about Blizzard's revenue model is the onus is on the players to create the entertainment through interaction in the massive virtual world. Blizzard simply provides the platform. Therefore their costs should be lower.

But on the other hand, where is Blizzard really adding value?

It's really interesting to see how game companies are innovating with new business models in order to drive revenue.

I'll be interested to see how the combination of game episodes and in-game advertising will pan out. Could gaming become far more commercial in the future? Will you sit down and order an episode of Half-Life 2 on your Sky box? Then load it up only to find the G-Man has started wearing a Rolex and drinking Dr. Pepper...

What's the worst that could happen? :)


Posted by wigunara | 0 Comments

In search of the Renaissance Geek

In Europe in the 16th century - when the world was a simpler place - there was such a thing as the Renaissance Man. A person who grew up in a time when it was possible to know everything there was to know.

Today, it's impossible to know everything. But what I want to know is, is it possible to know everything within an industry e.g. Computing?

Is there someone out there who thinks they know everything, or close to everything?

What is the most one person knows?

And more importantly, is that person worth the combined value of the people who each have only part of his knowledge, or is there a diminishing return on knowing everything, since it still does not solve the problem of time?

 

Leonardo da Vinci's Vitruvian Man
Posted by wigunara | 1 Comments

The Goal Keeper Approach to Security

In my previous blog I made the case for a paradigm shift in the anti-virus industry. Today I found an interesting article that helps support my case. The article examines whether anti-virus engines (and their creators - the anti-virus vendors) are really doing anything useful, and whether they really care.

http://www.emailbattles.com/archive/battles/virus_aadddbhadc_ia/

It seems that anti-virus engines and their parent companies are failing to protect us properly, so what you are left with is the good old goalkeeper approach to security on which anti-virus is based. It's a black-listing technology, and black-listing is just not good enough when the stakes are this high.

 

The ball is the virus, the goal is your business, the keeper is your current anti-virus solution

 

Posted by wigunara | 2 Comments

Why File Systems Suck - why putting "My Computer" on the desktop was a big design mistake

The PC file system is a double-edged sword. The file systems of DOS/Windows have allowed PC users to enjoy a simple and flexible store for data for a long time. And although it's designed to be a metaphor for traditional filing cabinets, with folders and files, in reality it is just as limited.

Sure, I can use Windows Explorer to move files about, copy them over the network and delete (then recover) them. People felt Windows gave them freedom to explore their PC - within the Windows environment - because they could access all the data on their hard disk, floppy disk or CD-ROMs. It was called Windows Explorer for a reason. Don't get me wrong - I'm not trying to belittle Windows 95 by saying its shell was basically a file manager - but it was certainly a prominent feature, and one that people liked.

But this level of point-and-click flexibility which Microsoft really committed to with Windows 95 also haunts us today.

Here's why -

It doesn't give us a rich data experience, and it allows users to play around with stuff they shouldn't need to. It doesn't separate executable code from data very well. It doesn't allow files to be "paper-clipped" together in the same way they are in the real world - unless you put them in a folder. It doesn't separate replaceable data (i.e. that which came from a CD-ROM or network installation) from original material created by the user. And the file system boundary is a physical device.

Not only are user data and system data all mixed together on this "all things to all people/programs" store, but the user data can typically only be understood by the application that was used to create it.

Still keen on the file system?

"But wait", you say - "if you want a rich data experience why don't you use a database?"

Databases sit atop the file system and provide a standard way (SQL) to access data stored by a proprietary server (e.g. SQL Server, DB2, MySQL).

The schema for each table is defined by the author of that table - there is no de facto standard for, say, a Contact.

There is another problem - the implementation of SQL tends to vary between vendors. And another problem still - SQL implementations tend to be server-based, and therefore have boundaries just like file systems do.

So, on the one hand we have file systems, which are too accessible and too simple, but have low overhead and can store anything; on the other hand we have databases, which are too complex and too customisable, rely on servers and have high overhead.

So what's the solution?

We need to bring the two concepts together to produce a rich data experience with the simplicity of a file system. "My Computer" should become "My Data" - that's all I'm interested in. Do I care about the fragmentation of my file system? Do I care how many files are in the root of drive C:? Do I want to be able to copy the Program Files folder? Do I want to trawl through folders to find the right file?

NO!

I want my data, I want other people's data and I want it in a rich way and most importantly - I want it instantly!

So how is Microsoft doing against this yardstick?

Well, since Windows 98 we've been trying to sort out the mess of the file system: we added "My Documents"; with Windows 2000 we modified Explorer to hide the root of drives and the Windows directory by default; and with XP we got rid of My Computer from the desktop.

With Windows Vista we'll introduce the first part of that rich search experience, and we'll introduce transactional support for NTFS.

Later we'll introduce WinFS, which is our first attempt at bringing a database and file system together.

I have a dream, where C: is irrelevant. I never want to go into "My Computer" again - I don't want to know what the folder hierarchy of my hard drive is, and I don't want to have to wait ages to find anything.

I want to deal with people, document titles and types. If there is an obscure document written by a single author, stored on a server under the stairs in some poky building in a faraway country - I want to be able to find that document in seconds!

If I save a file to my USB memory stick and put it in my pocket, then when I search for that file on my laptop I want it to know that I put the file on that memory stick. I want it to tell me when I did, and offer me an offline copy of the memory stick from when it was last inserted (after all, what else are we doing with all that free space on people's hard disks?).

If I'm in a meeting at a customer site with a colleague, and I'm trying to access a document stored at our company HQ but I have no connectivity, I want my laptop to also query my colleague's laptop to see if it has a copy of the file and access it from there.

I want data at my fingertips, across physical device boundaries, and I want to be able to share it with other people seamlessly.

I wait in anticipation...

Posted by wigunara | 3 Comments

Microsoft XNA

XNA is a framework and set of tools that enables game/3D entertainment developers to write reusable code and content for both Windows and Xbox 360 - that's both exciting and a compelling business case straight away. But there's more...

XNA Studio builds on Visual Studio 2005 Team System and enables the roles and processes of game development to be managed in the build environment.

So with XNA technologies a game studio can produce content for an Xbox 360 and Windows PC at the same time, in a fully collaborative development environment.

"Software will be the single most important force in digital entertainment over the next decade. XNA underscores Microsoft's commitment to the game industry and our desire to work with partners to take the industry to the next level." - Bill Gates

Good stuff. I've written solutions that feature 95% cross-compatible .NET Framework and .NET Compact Framework code, and I can tell you, it really does cut development time down dramatically - plus it improves agility, as you can easily move functionality from PDA to PC to Smartphone and vice versa depending on market demand. The only bits that have to be unique to each device are the interface and some kind of abstraction layer over the more complex APIs, but if you're building with a three-tier architecture that's common sense anyway :)
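
To illustrate the kind of separation I mean, here's a minimal, hypothetical C# sketch (the interface and class names are made up for this example - they aren't part of XNA or the Compact Framework): the shared logic codes against an interface, and each device family supplies its own implementation.

// Shared logic (compiled for both .NET Framework and .NET Compact Framework) codes against this abstraction.
public interface ISettingsStore
{
    void Save(string key, string value);
    string Load(string key);
}

// PC implementation - full .NET Framework.
public class DesktopSettingsStore : ISettingsStore
{
    public void Save(string key, string value) { /* write to a file or the registry */ }
    public string Load(string key) { return string.Empty; /* read it back */ }
}

// PDA/Smartphone implementation - .NET Compact Framework.
public class DeviceSettingsStore : ISettingsStore
{
    public void Save(string key, string value) { /* write to the device's persistent store */ }
    public string Load(string key) { return string.Empty; /* read it back */ }
}

The shared tier only ever sees ISettingsStore; at start-up each build simply creates the implementation that suits the device it's running on.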

If this XNA stuff does similar things for game developers, then they will realise similar efficiencies.

http://www.microsoft.com/xna/

Posted by wigunara | 0 Comments

Windows Mobile, SQL Mobile 2005 and .NET Compact Framework 2.0 - a powerful combination

I want to share with you the enthusiasm I have for the Windows Mobile platform. If I had to pick an area of innovation that excited me most last year, it would have to be the developments in the Windows Mobile space. Windows Mobile and SQL Mobile 2005, in conjunction with the .NET Compact Framework 2.0, form an incredibly powerful platform for mobile information workers.

How many of you have a Windows Mobile powered device - a PDA or Smartphone?

Did you know you could get SQL Mobile for your Windows Mobile device?

Did you know that SQL Mobile can be a merge replication subscriber for data from your backend SQL Server 2000 or 2005 databases?

Now imagine what you could do with that. Any data stored in a SQL Server database could be made available to a Windows Mobile equipped workforce.

Obviously, in order to do anything meaningful with that data you need a front-end application on the PDA or Smartphone - that's where the .NET Compact Framework 2.0 comes in handy.

Changes to the data on the mobile device are synchronised with SQL Server when replication is triggered and a flexible conflict resolution system is on hand should the complexity of your solution require it.
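
To give a flavour of what the synchronisation code looks like on the device, here's a minimal sketch using the SqlCeReplication class from System.Data.SqlServerCe. The server names, publication and connection details below are placeholders, and a real solution would add credentials and proper error handling.

using System.Data.SqlServerCe;

class SyncDemo
{
    static void Main()
    {
        SqlCeReplication repl = new SqlCeReplication();
        repl.InternetUrl = "https://syncserver/sqlmobile/sqlcesa30.dll"; // placeholder URL for the SQL Mobile Server Agent
        repl.Publisher = "HQSQLSERVER";                                  // placeholder SQL Server 2000/2005 publisher
        repl.PublisherDatabase = "FieldData";
        repl.Publication = "FieldDataPub";
        repl.Subscriber = "PDA01";
        repl.SubscriberConnectionString = @"Data Source=\My Documents\FieldData.sdf";

        // First run only: create the local subscription database on the device.
        // repl.AddSubscription(AddOption.CreateDatabase);

        // Pull down server changes and push local edits back to SQL Server.
        repl.Synchronize();
    }
}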

To find out more about SQL Mobile 2005 visit http://www.microsoft.com/sql/editions/sqlmobile/default.mspx

 

Posted by wigunara | 1 Comments

Windows Vista Aero Glass and GeForceFX Go 5200

If, like me, you have a laptop with a GeForce FX Go 5200, you may have already discovered that NVIDIA does not provide support for these cards in their latest Vista drivers. Fear not, because it's not a big task to work around this in three easy steps -

Step 1

Download the latest Vista drivers from NVIDIA: http://www.nvidia.com

Step 2

Extract the ZIP file containing the drivers to somewhere on your local machine.

Step 3

Within the extracted folder, open nv_disp.inf in Notepad.

Locate the section [NVIDIA.Mfg.NTx86.6.0] (32-bit Vista) or the similarly named 64-bit Vista section and add the following line at the end of the list of devices -

%NVIDIA_NV34.DEV_0324.1% = nv_NV3x,   PCI\VEN_10DE&DEV_0324

(NOTE: Owners of other unsupported NVIDIA cards can probably add support for their cards by copying the PCI\VEN information from the device in Device Manager.)

Next, find the section titled [Strings] and add the following line -

NVIDIA_NV34.DEV_0324.1 = "NVIDIA GeForce FX Go5200"

That's it! Save the modified file and exit Notepad. Now go ahead and open Device Manager, find the GeForce Go device and click "Update Driver...", then point it to the location of the extracted NVIDIA drivers.

 

Posted by wigunara | 1 Comments

Does the Anti-virus industry have "The wrong end of the stick"?

My previous blogs have brought up an interesting debate about white-listing vs. black-listing. To recap, white-listing is the process of explicitly listing what is good and assuming everything else is not. Black-listing is the process of explicitly listing what is bad, and assuming everything else is good.

The Group Policy article concludes that white-listing is the better approach in the case of Software Restriction Policies, and if you have a developer background then the Least Privilege design principle also suggests white-listing is better.

The principle also applies "in the real world", where you require a passport before you can pass through airport security (although countries often also operate a black-list of passports they do not trust).

White-listing examples

· Passports/Driving License/ID Cards

· Security clearance for MOD/DOD

· Software restriction policies

· Credit

· Firewalls

· Operating System File Security

Black-listing examples

· Anti-virus

· Most Website filtering software

· Passports (again)

· Night-club/Bar security

· Police records

· Software restriction policies (again)

· Society in general (innocent until proven guilty)

· Countries

What I want to understand is, fundamentally, are there any criteria to consider before deciding on a white or black listing policy for any given need?

I think so. I think it comes down to -

· Average trustworthiness  

· Complexity to implement a black listing policy

· Complexity to implement a white listing policy

· Consequences of misplaced trust

Take driving licenses for example

· How much would you trust the average person with an automobile?

· How difficult would it be to implement a black-listing policy for drivers who cause accidents?

· How difficult would it be to implement a white-listing policy, which requires training and a test in order to obtain a license?

· What happens if we don't train and test people before letting them drive?

My answers would be,

· not very much

·  very difficult

·  even more difficult

·  people who cannot actually drive may end up behind the wheel and cause fatal accidents.

Given the first and last answers, driving licenses (a white-listing policy) make sense and overcome the inherent difficulties of a white-listing policy.

Anti-virus and website filtering software work on the assumption that a program or site is good unless the vendor has black-listed it.

Spam filtering software is an interesting one. Typically spam filtering has been a black-listing process, but as spam spiralled out of control there were a number of initiatives (some by Microsoft) to move to a white-listing system. Although ultimately the spam problem is about lack of accountability, and out of the scope of this article, most spam filtering is a mixture of black-listing and heuristics.

Perhaps there should be an anti-virus product for lockdown environments that works on a white-listing principle - for the uber security conscious.

Here is a call for comments - why don't anti-virus companies work in reverse and publish a list of known good software? I suggest that, by using the criteria above, there is a case for a white-listing anti-virus product -

1. Average trustworthiness of software is good

2. Difficulty to maintain a black-listing policy - black-listing is a reactive process for anti-virus. As viruses come out, the anti-virus vendors must respond; they have no control over the process.

3. White-listing could be managed proactively, with scheduled submission dates for testing. The AV vendors would not expose their customers to risk by failing to create a virus definition quickly enough.

4. The consequence of not catching a virus quickly enough, failing to produce a virus definition update, or the customer failing to download the update is that the customer could be exposed to malicious code.

This makes a strong case for a white-listing solution; only point 1 defends the existing model.

In addition, consider the following -

· Bad guys don't want their software found. Good guys want everyone to get hold of their programs (usually for commercial reasons).

· IT administrators can simply integrate the list of known good software with their software restriction policy.

· It finally brings anti-virus in line with other white-listing IT security technologies, such as firewalls.

Anti-virus is potentially hit and miss, as there could be a delay between a virus being released into the wild and each anti-virus vendor publishing an update. This is akin to a goalkeeper in soccer (football to us Brits): if you're lucky the goalkeeper will stop the ball (virus); if not, the other guys score.

I suggest the industry needs an all-encompassing solution that works using technologies from software restriction policy, anti-virus and Authenticode to provide a white-listing solution. As more and more businesses and governments run their processes on computers - the risk of malicious software running on some computer systems is just too high for a "goal-keeper" approach to defense.

 

Posted by wigunara | 1 Comments

Security and stability improvements unique to Windows Vista x64

Windows Vista x64 will mandate that all kernel modules are digitally signed by Microsoft. This is unique to the x64 (AMD64/Intel EM64T) build of the software, and does not apply to 32-bit (x86) builds or IA64 (Itanium) builds.

By doing this, vendors who write code that runs in kernel mode will need to get their code certified by Microsoft. In essence - if it's running in kernel mode, it's been checked by Microsoft. (UPDATE: Or the publisher has been issued with a Publisher Identity Certificate by Microsoft, which allows them to sign their own software.)
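
For driver publishers the signing step itself is done with the signtool.exe utility from the SDK, along these lines (a rough sketch only - the certificate file, subject name and timestamp URL are placeholders; see the kmsigning whitepaper linked below for the authoritative walkthrough):

signtool sign /v /ac "CrossCertificate.cer" /s my /n "Example Hardware Ltd" /t http://timestamp.verisign.com/scripts/timestamp.dll mydriver.sys

Here /ac adds the cross-certificate that chains the publisher's certificate back to Microsoft, /s and /n select the publisher's own certificate from the local store, and /t timestamps the signature.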

Even users with administrator privileges cannot load unsigned kernel-mode code on x64-based systems. This applies for any software module that loads in kernel mode, including device drivers, filter drivers, and kernel services.

What this amounts to is both a security and stability enhancement because unsigned kernel modules will be blocked from loading (period).

Malware designed to use kernel modules and run in kernel mode will have a hard time in the x64 Vista timeframe.

This latest change, combined with enhancements that prevent kernel patching in x64 (since Windows Server 2003 SP1) will help reduce the attack surface for kernel mode malware in Windows Vista.

That's great news and another good reason to evaluate both x64 hardware and Windows Vista.

References:

Patching Policy for x64-Based Systems http://www.microsoft.com/whdc/driver/kernel/64bitPatching.mspx

Digital Signatures for Kernel Modules on x64-based Systems Running Windows Vista
http://www.microsoft.com/whdc/system/platform/64bit/kmsigning.mspx


 

Posted by wigunara | 0 Comments

Bypass Software Restriction Policies and some Group Policies

Mark Russinovich, of Sysinternals fame and an author I have a great deal of time for, has published a fascinating blog describing a proof-of-concept application called GPDisable which, when allowed to run, can circumvent parts of Group Policy - it even works from a limited user account!

In a managed environment, the IT manager is always looking for ways to ensure users cannot damage computers, breach company policy or run unlicensed software. To help prevent the use of unauthorised or dangerous software they often end up using Group Policy - Software Restriction Policies (SRPs).

SRPs are great for restricting applications. There are two main approaches to using them -

  • Black-list mode is where you specify what software is NOT ALLOWED to run; everything else is ALLOWED. 
  • White-list mode is where you specify what software is ALLOWED to run; everything else is NOT ALLOWED (see the sketch below). 
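
As a rough idea of what white-list mode looks like in practice (a sketch only, not a prescriptive guide - test it in a lab before going anywhere near production), in the Group Policy editor you would:

1. Open Computer Configuration > Windows Settings > Security Settings > Software Restriction Policies and create a new policy.
2. Set the Default Security Level to Disallowed, so nothing runs unless a rule explicitly allows it.
3. Add Unrestricted path rules for the locations your trusted software lives in, e.g. %WINDIR% and %ProgramFiles%.
4. Add further path, hash or certificate rules for any line-of-business applications installed elsewhere.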

If you're not a developer then it may not make a lot of sense, but it's important to understand that if a user can run a program like GPDisable, then the user can bypass many Group Policy settings.

"What other Group Policy settings are susceptible to this type of attack? System-wide security settings are enforced by core operating system components not accessible to limited users, but most of the settings in the Windows Components area of the Group Policy editor’s Administrative Templates node are ineffective in environments where end-users can run arbitrary applications such as Gpdisable:

Notably, Internet Explorer configuration, including Zones, fall into this area, as do Explorer, Media Player, and Messenger settings. The bottom line is that full control of an end-user environment is possible only with strict lock-down of the programs users run, something that you can accomplish by using SRP in white-list mode, for example. It's also important to note that the ability of limited users to override these settings is not due to a bug in Windows, but rather enabled by design decisions made by the Microsoft Group Policy team."

http://www.sysinternals.com/blog/2005/12/circumventing-group-policy-as-limited.html

There is a tradeoff to be made between tight security and a flexible desktop environment. There doesn't seem to be a middle ground: either you want your desktops locked down - and you must be prepared to take whatever steps are necessary - or you don't mind, you trust your users, and you accept that if they wanted to, they could bypass Group Policy settings.

I think most administrators would choose the latter for business desktops; if you are in charge of public terminals then it's probably the former - but let me know if you disagree.

I'm opening up the debate around white vs. black listing in my next blog, not just for IT security, but as a general principle.

Posted by wigunara | 0 Comments