Korea Evangelist

Developer & Platform Evangelism, Microsoft Korea

March, 2011

  • Korea Evangelist

    {CODE PARTY} The last Code Party of the first half of this year, where you can catch a Kinect-related session, is tomorrow. ^^

    • 0 Comments

  • Korea Evangelist

    Why do even well-known websites show errors in W3C Validation Service results?

    • 0 Comments

    The W3C does not provide a program that certifies whether its standards are being followed; it uses the term Recommendations rather than Standards, and it does not enforce compliance. The W3C Validation Service itself states that it is neither a quality check nor a guarantee of standards conformance (http://validator.w3.org/about.html).

    Strictly speaking, complying with web standards and passing HTML validation are two different things. Passing HTML validation means that the formal grammar defined in the W3C specification is being used correctly, whereas web standards compliance also covers everything in the technical specification beyond that grammar.

    The validation service provided by the W3C tests whether the standard grammar has been used correctly, but following the standard grammar does not guarantee that a page will render identically in every web browser, because each browser implements the standards differently. (This is why Microsoft creates test cases, contributes them to the W3C, and runs the IE Testing Center at http://samples.msdn.microsoft.com/ietestcenter/.) In addition, for some code, strictly following the specification increases page size, which can become a traffic burden for services with a large audience. For users and service providers, what matters on today's websites is not strict adherence to W3C validation for its own sake, but whether every browser produces the same result and how little traffic is generated; as a result, the W3C Validation Service is applied flexibly.

    References

     

    Web Standards?

     

    http://en.wikipedia.org/wiki/Web_standards

     

    "Web standards" is a collective term for the formal standards defined by the W3C and other related technical specifications. More recently, as a trend word, it also refers to a philosophy of web design and development that encompasses standardized guidelines and methods for building websites.

    In its broader sense, web standards can be understood to include the following:

     

    • Recommendations published by the W3C
    • Internet Standard (STD) documents published by the IETF
    • Standards published by ISO
    • Standards published by Ecma International
    • The Unicode Standard and Unicode Technical Reports (UTRs) published by the Unicode Consortium
    • Name and number registries operated by IANA

     

    Common usage

    When a web site or web page is described as complying with web standards, it usually means that the site or page has valid HTML, CSS and JavaScript. The HTML should also meet accessibility and semantic guidelines.

    When web standards are discussed, the following publications are typically seen as foundational:

    • Recommendations for markup languages, such as Hypertext Markup Language (HTML), Extensible Hypertext Markup Language (XHTML), Scalable Vector Graphics (SVG), and XForms, from W3C.
    • Recommendations for stylesheets, especially Cascading Style Sheets (CSS), from W3C.
    • Standards for ECMAScript, more commonly JavaScript, from Ecma International.
    • Recommendations for Document Object Models (DOM), from W3C.
    • Properly formed names and addresses for the page and all other resources referenced from it (URIs), based upon RFC 2396, from IETF.[8]
    • Proper use of HTTP and MIME to deliver the page, return data from it and to request other resources referenced in it, based on RFC 2616, from IETF.[9]

    Web accessibility is normally based upon the Web Content Accessibility Guidelines[10] published by the W3C's Web Accessibility Initiative.

    Work in the W3C toward the Semantic Web is currently focused by publications related to the Resource Description Framework (RDF), Gleaning Resource Descriptions from Dialects of Languages (GRDDL) and Web Ontology Language (OWL).

    Standards publications and bodies

    A W3C Recommendation is a specification or set of guidelines that, after extensive consensus-building, has received the endorsement of W3C Members and the Director.

    An IETF Internet Standard is characterized by a high degree of technical maturity and by a generally held belief that the specified protocol or service provides significant benefit to the Internet community. A specification that reaches the status of Standard is assigned a number in the IETF STD series while retaining its original IETF RFC number.

     

    Pasted from <http://en.wikipedia.org/wiki/Web_standards>

     


     

     

    W3C Validator?

     

    What is Markup Validation?

    Most pages on the World Wide Web are written in computer languages (such as HTML) that allow Web authors to structure text, add multimedia content, and specify what appearance, or style, the result should have.

    As for every language, these have their own grammar, vocabulary and syntax, and every document written with these computer languages are supposed to follow these rules. The (X)HTML languages, for all versions up to XHTML 1.1, are using machine-readable grammars called DTDs, a mechanism inherited from SGML.

    However, Just as texts in a natural language can include spelling or grammar errors, documents using Markup languages may (for various reasons) not be following these rules. The process of verifying whether a document actually follows the rules for the language(s) it uses is called validation, and the tool used for that is a validator. A document that passes this process with success is called valid.

    With these concepts in mind, we can define "markup validation" as the process of checking a Web document against the grammar (generally a DTD) it claims to be using.

     

    Pasted from <http://validator.w3.org/docs/help.html>
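
    As a concrete illustration (this snippet is mine, not part of the quoted documentation), a minimal HTML 4.01 Strict document that declares the grammar it claims to be using might look like the following; the DOCTYPE line is what tells the validator which DTD to check the document against:

    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
    <html>
    <head>
      <!-- TITLE is required by the HTML 4.01 DTD; omitting it is a validation error -->
      <title>Minimal valid page</title>
    </head>
    <body>
      <p>Hello, validator.</p>
    </body>
    </html>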

     

     

     

    The W3C CSS validator is developed with assistance from the Mozilla Foundation, and supported by community donations.

     

    Pasted from <http://validator.w3.org/>

     


     

    Validating Web documents is an important step which can dramatically help improving and ensuring their quality, and it can save a lot of time and money (read more on why validating matters). Validation is, however, neither a full quality check, nor is it strictly equivalent to checking for conformance to the specification.

     

    Pasted from <http://validator.w3.org/about.html>

     

    Why Validate?

     

    Validation as a debugging tool

    While contemporary Web browsers do an increasingly good job of parsing even the worst HTML “tag soup”, some errors are not always caught gracefully. Very often, different software on different platforms will not handle errors in a similar fashion, making it extremely difficult to apply style or layout consistently.

    Using standard, interoperable markup and stylesheets, on the other hand, offers a much greater chance of having one's page handled consistently across platforms and user-agents. Indeed, most developers creating rich Web applications know that reliable scripting needs the document to be parsed by User-Agents without any unexpected error, and will make sure that their markup and CSS is validated before creating a rich interactive layer.

    When surveyed, a large majority of Web professionals will state that validation errors is the first thing they will check whenever they run into a Web styling or scripting bug.

    Validation as a future-proof quality check

    Checking that a page displays fine in several contemporary browsers may be a reasonable insurance that the page will “work” today, but it does not guarantee that it will work tomorrow.

    In the past, many authors who relied on the quirks of Netscape 1.1 suddenly found their pages appeared totally blank in Netscape 2.0. Whilst Internet Explorer initially set out to be bug-compatible with Netscape, it too has moved towards standards compliance in later releases.

    Validation is one of the simplest ways to check whether a page is built in accordance with Web standards, and provides one of the most reliable guarantee that future Web platforms will handle it as designed.

    Validation eases maintenance

    It is reasonable to consider that standards such as HTML and CSS are a form of “coding style” which is globally agreed upon. Creating Web pages or applications according to a widely accepted coding style makes them easier to maintain, even if the maintenance and evolution is performed by someone else.

    Validation helps teach good practices

    Many professionals have been authoring the Web with HTML and CSS for years and know these technologies by heart. Beginners and students, on the other hand, will find automated checking tools invaluable in spotting mistakes. Some teachers also stress that automated validation tests are a good introduction to broader, more complex quality concepts such as accessibility.

    Validation is a sign of professionalism

    As of today, there is little or no certification for Web professionals, and only few universities teach Web technologies, leaving most Web-smiths to learn by themselves, with varied success. Seasoned, able professionals will take pride in creating Web content using semantic and well-formed markup, separation of style and content, etc. Validation can then be used as a quick check to determine whether the code is the clean work of a seasoned HTML author, or quickly hacked-together tag soup.

     

    Pasted from <http://validator.w3.org/docs/why.html>

     

    Is validation some kind of quality control? Does "valid" mean "quality approved by W3C"?

    Validity is one of the quality criteria for a Web page, but there are many others. In other words, a valid Web page is not necessarily a good web page, but an invalid Web page has little chance of being a good web page.

    For that reason, the fact that the W3C Markup Validator says that one page passes validation does not mean that W3C assesses that it is a good page. It only means that a tool (not necessarily without flaws) has found the page to comply with a specific set of rules. No more, no less. This is also why the "valid ..." icons should never be considered as a "W3C seal of quality".

     

    Pasted from <http://validator.w3.org/docs/help.html>

     

    Ampersands (&'s) in URLs

    Another common error occurs when including a URL which contains an ampersand ("&"):

    <!-- This is invalid! -->
    <a href="foo.cgi?chapter=1&section=2&copy=3&lang=en">...</a>

    This example generates an error for "unknown entity section" because the "&" is assumed to begin an entity reference. Browsers often recover safely from this kind of error, but real problems do occur in some cases. In this example, many browsers correctly convert &copy=3 to ©=3, which may cause the link to fail. Since &lang; is the HTML entity for the left-pointing angle bracket, some browsers also convert &lang=en to ⟨=en. And one old browser even finds the entity &sect;, converting &section=2 to §ion=2.

    To avoid problems with both validators and browsers, always use &amp; in place of & when writing URLs in HTML:

    <a href="foo.cgi?chapter=1&amp;section=2&amp;copy=3&amp;lang=en">...</a>

    Note that replacing & with &amp; is only done when writing the URL in HTML, where "&" is a special character (along with "<" and ">"). When writing the same URL in a plain text email message or in the location bar of your browser, you would use "&" and not "&amp;". With HTML, the browser translates "&amp;" to "&" so the Web server would only see "&" and not "&amp;" in the query string of the request.

     

    Pasted from <http://www.htmlhelp.com/tools/validator/problems.html>

     

     

     

    Is validity the same thing as conformance?

    No, they are different concepts.

    Markup languages are defined in technical specifications, which generally include a formal grammar. A document is valid when it is correctly written in accordance to the formal grammar, whereas conformance relates to the specification itself. The two might be equivalent, but in most cases, some conformance requirements can not be expressed in the grammar, making validity only a part of the conformance.

     

    Pasted from <http://validator.w3.org/docs/help.html>
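
    To make that distinction concrete (my example, not the W3C FAQ's): in HTML 4.01 the datetime attribute of ins/del is declared as plain CDATA in the DTD, while the prose of the specification requires an ISO 8601 date, so a snippet can be valid yet still non-conforming:

    <!-- Valid against the HTML 4.01 DTD (datetime is only CDATA there),
         but non-conforming: the spec's prose requires an ISO 8601 value -->
    <ins datetime="last Tuesday">Updated copy.</ins>

    <!-- Both valid and conforming -->
    <ins datetime="2011-03-15T09:00:00+09:00">Updated copy.</ins>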

     

     

    I don't want error messages, I want you to clean up my page!

    Have a look at tools such as HTML Tidy and tidyp. When selected, the "Clean up Markup with HTML-Tidy" option will output a "cleaned" version of the input document in case it was not valid, done with HTML-Tidy, using the Markup Validator's default HTML-Tidy configuration. Note that there are no guarantees about the validity or other aspects of that output, and there are many options to configure in these tools that may result in a better clean-up of your document than the Validator's default options, so you may want to try them out locally.

     

    Pasted from <http://validator.w3.org/docs/help.html>

     

     

    Why doesn't the validator like my <link ... /> or <meta ... />?

    HTML is based on SGML and uses an SGML feature (called SHORTTAG) (note that this is not the case with XHTML).

    With this feature enabled, the "/" in <link ... /> or <meta ... /> already closes the link (or meta) tag, and the ">" becomes some regular text, which is not allowed in the <head> element. Since </head><body> is optional in HTML (again, not in XHTML), it is silently inserted, thus head-only elements like meta and style as well as "</head>" and "<body>", which may appear only once, become false.

    (explanation courtesy of Christoph Päper)

     

    Pasted from <http://validator.w3.org/docs/help.html>
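
    A small side-by-side sketch of what this means in practice (my own example; site.css is just an illustrative file name):

    <!-- HTML 4.01: no trailing slash; these elements have no end tag -->
    <link rel="stylesheet" type="text/css" href="site.css">
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8">

    <!-- XHTML 1.0: the trailing slash is required because every element must be closed -->
    <link rel="stylesheet" type="text/css" href="site.css" />
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />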

     

     

     

    Google Home Page has 67 Validation Errors!

     

    Pasted from <http://www.webmasterworld.com/google/3244244.htm>

     

    It must be some kind of corporate decision to save on bandwidth.

    Looking at it from the figures you gave, Google's page is almost 17 times 'lighter' than MSN's. This could mean big money if they were to make it valid code.

    Although looking at their code a bit, they use font tags, inline css, and tables ... which is bad, bad, bad practice Big G!

     

    Pasted from <http://www.webmasterworld.com/google/3244244.htm>

     

     

     

    Looking at the errors, most of them are unescaped ampersands. Putting &amp; instead of & when this error causes little to no problems would be silly, considering the amount of traffic that Google handles.

     

    Pasted from <http://www.webmasterworld.com/google/3244244.htm>

     

    Additionally, W3C is only a recommendation ;)

     

    Pasted from <http://www.webmasterworld.com/google/3244244.htm>

     

     

     

    How can we expect Webmasters to take code quality seriously if the leader in search doesn't?

    We can't, and Google doesn't either. I can't point to any statement where Google emphasizes W3C validation for websites.

    W3C validation is a very useful TOOL - especially when there are ranking issues with a website (debugging HTML etc.) - however, validation is not a prerequisite for high rankings from what I see (most of our competitors return way more serious errors than that).

     

    Pasted from <http://www.webmasterworld.com/google/3244244.htm>

     

     

     

    If you put <b> into an html 4 or xhtml document, it will not validate. But it will work on every browser that anyone uses and it will work on any browser for the foreseeable future.

    On the other hand, I used a popular theme with a CMS that is xhtml compliant and all the pages validated perfectly. It looked great on firefox, mozilla and opera. In IE it produced a page, that was incredibly difficult to read.

     

    Pasted from <http://www.webmasterworld.com/google/3244244.htm>

     

     

     

    On the other hand, I used a popular theme with a CMS that is xhtml compliant and all the pages validated perfectly. It looked great on firefox, mozilla and opera. In IE it produced a page, that was incredibly difficult to read.

    Doesn't sound like an HTML/XHTML Validation issue. More of a CSS issue?

     

    Pasted from <http://www.webmasterworld.com/google/3244244.htm>

     

     

     

     

    If it looks ok - then it probably is ok (pageoneresults, you mention when it needs a fix because it doesn't look right - well google home page looks fine to me)

     

    Pasted from <http://www.webmasterworld.com/google/3244244.htm>

     

     

     

     

    OK, let's do some markup analysis. First comment is that you have to be careful what the validator is actually validating, because Google is using content-negotiation (for browser and charset) as well as IP delivery so what you see is not what you validate. Careful with page weight too, as Google uses gzip over the wire.

    So I took for my example the plain google.com homepage when logged out, as viewed in Firefox 1.5 (because that's what I'm using). There are some differences between this version and the one "seen" by the validator, for example the character encoding.

    The test version weighs in at 4617 bytes (uncompressed), and contains 63 validation errors. Because a large number of errors are repeats and doubles due to the unescaped ampersands, you can reduce this down to 30 actual errors. Break these down, and you get the following:

    a) No DOCTYPE (1 occurrence)

    b) missing type attributes for script/style (3)

    c) unquoted values (15)

    d) topmargin/marginheight (1 occurrence of each)

    e) unescaped ampersands (7)

    f) using name attribute on a span (1)

    g) using nowrap on a div (1)

    I took the markup and made the least number of changes possible to make the page validate. I used the doctype:

    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">

    as this preserves quirks mode and was required to handle the transitional markup of the page. I couldn't use HTML 4.0 Transitional (which would have saved one more byte) as this would trigger another error, and no earlier HTML version was appropriate for the markup used.

    The resulting valid version weighs in at 4911 bytes. However, it was not 100% valid - one error remains, as fixing it would involve altering the JavaScript used on the page:

    <span name=more

    which is referenced in the script as:

    getElementsByName('more')

    The difference between the original invalid version and my "almost valid" one is therefore 294 bytes.

    If you remove the doctype (which was added for validation, but is not required in real use as the page is being served as quirks mode anyway), you can save 63 bytes. It is also possible to make minor changes to get rid of unnecessary markup and some line-breaks. The doctype-less and corrected version weighs in at 4767 bytes, leaving a meager difference of 150 bytes to (almost) validate the page - or a 3.1% increase in page-weight. Over the wire (rather than viewing locally), the gzip compression would reduce the 150 bytes to as little as one-fifth of that.

    This admittedly very rapid analysis was done without actually looking at the design of the page itself. There remains a glut of font elements in the markup which could be replaced by CSS and probably save enough to cover the difference. And no, it's not for older browsers that font is still used; there is plenty of other, more advanced CSS used in the page beyond the simple font-size declarations. The one remaining validation error could probably be fixed easily with a script tweak (the details are beyond this overview).

    So to summarize, you cannot in my opinion justify the invalid markup on bandwidth savings - even when you take into account the huge number of pageviews that page gets. The page appears to have been built and modified over time without a review of the whole page design, with new parts treated differently than old ones - for example, the new feature at the top right uses CSS whereas the footer uses font tags. Google would do better to take the page as a whole, review and recode it; the result would probably undercut even the current version for size and could validate to boot.

     

    Pasted from <http://www.webmasterworld.com/google/3244244.htm>
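
    Most of the error categories in the analysis above map to mechanical fixes. Here is a before/after sketch using invented markup in the same categories (a missing type attribute on style, an unquoted attribute value, an unescaped ampersand) - not Google's actual source:

    <!-- Before: the validator flags the missing type attribute on style, the
         unquoted href value (it contains characters such as "/" and "?"),
         and the unescaped "&" -->
    <style>body{margin:3px}</style>
    <a href=/search?q=test&hl=en>Search</a>

    <!-- After: the same markup with each of those errors fixed; a few bytes
         heavier, but it validates -->
    <style type="text/css">body{margin:3px}</style>
    <a href="/search?q=test&amp;hl=en">Search</a>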

     

    Recently Matt Cutts, a Google software engineer, has said again that W3C validation does not affect search engine rankings. This is not the first time he has said this, and it will not be the last, as many people remain convinced that W3C validation is a ranking factor. Many people are convinced that having a cleanly coded website will increase search engine rankings. This is in fact untrue. Clean code does, however, help the search engine spiders read and understand your website more easily: a cleanly coded website guides the spider to the important places on your site without it getting jumbled up in a web of unnecessary coding.

    The main reason W3C validation is not used as a ranking signal is that Google is concerned with browser compatibility. Just because a website shows up looking perfect in Internet Explorer does not mean it will look the same in Mozilla Firefox, on mobile phones, on web TV, and so on. This is a big issue for many website developers, as one site can look great on one computer and horrible on another. This is Google's main issue with W3C validation: passing the W3C validation test does not mean a site will be compatible with all browsers, which is why Google does not factor it into its rankings. Another reason Google does not validate websites is the time it would take to validate each individual page. Internet users want everything to happen instantaneously; they do not want to wait around for slow-loading results when there are a million other sites out there with similar information. So Google skips validating websites in favor of the speed and agility that users appreciate in its search engine. This makes sense, as larger websites would be slowed down by the validation process, causing unhappy web users.

    Although it seems unlikely that there will be universal web standards anytime soon, we can all keep our hopes up. Universal web standards would make web designers' and developers' lives so much easier. As of now, different web browsers show websites differently: on Mozilla Firefox your website may look perfect, but on Internet Explorer the same website may be off-center. This causes many headaches and tiresome trial and error to perfect websites.

    There have been efforts to create universal web standards, but these have yet to be widely adopted. Website designers and developers are at odds with one another because of the issues caused by browser compatibility. Until universal standards are adopted, there will be frustrated people working behind the scenes to create websites that look good in every browser. This is a time-intensive, tedious project that could largely disappear once universal standards are adopted.

    The major roadblocks to universal web standards are the browsers themselves. They are not compatible with one another, causing websites to look good in one browser and horrible in another. Designers and programmers are frustrated by these inconsistencies between browsers, which make their jobs nearly impossible; they do not know which standards to target when building or coding a new website.

    Matt Cutts even admits that he wishes Google did validate web pages, but the reality is that a lot of sites on the web, even popular sites, do not pass validation. Dropping them from the rankings, or forcing them to redesign and recode, would have daunting effects on those websites: if the code needs to change, the site itself may need to change to adjust to the new code.

    Now you may be wondering why so many websites have a link at the bottom saying "W3C Valid." They are under the impression that this adds value to the page when customers see it. The real reason to validate your website is to look for human errors that you may have overlooked when building your site. W3C validation will show you broken links and many other important coding issues that can negatively affect the way humans see your site.

    In conclusion, it is a good idea to run W3C validation on your site to check for errors and clean up unnecessary coding. Who knows, in the next few months or years this may become an important ranking factor for the search engines, and you can be one step ahead.

     

    Pasted from <http://www.wmtips.com/seo/w3c-validation-not-ranking-factor-google.htm>

     

     

    Validating a Website

    Validating a website is the process of ensuring that the pages on the website conform to the norms or standards defined by various organizations. Validation is important, and will ensure that your web pages are interpreted in the same way (the way you want it) by various machines, such as search engines, as well as users and visitors to your webpage.

     

    Pasted from <http://codex.wordpress.org/Validating_a_Website>

     

     

    Validation Checklist

    To help you validate your WordPress site, here is a quick checklist:

    1. Validate HTML/XHTML
    2. Validate CSS
    3. Validate for Section 508 Standards (accessibility)
    4. Validate for WAI standards (accessibility)
    5. Validate Links (check for dead links)
    6. Validate Feeds
    7. Check across different browsers (include handheld computers, Mac, PC, and cellphones, too)
    8. Re-validate HTML and CSS
    9. Have friends, relatives, co-workers check your site
    10. When ready, you can post your site on the WordPress Forum's Your WordPress for review


     

    Pasted from <http://codex.wordpress.org/Validating_a_Website>

  • Korea Evangelist

    Windows Azure Eclipse Plugin for Java

    • 0 Comments

    Just as on Windows Server, you have been able to run Java on Windows Azure, and support for Java on Windows Azure keeps improving so that it can be run more conveniently.

    The Windows Azure Starter Kit for Java was released a short while ago, and building on it, a Windows Azure plugin for Eclipse has now been announced that makes tasks such as project configuration easy through a wizard.

    This lets you take advantage of Windows Azure's strengths as a cloud platform from the Java language, and it is an example of how Windows Azure is not limited to a particular language but takes the shape of an open cloud.

    For more details, see the links below.

     

    New Windows Azure Eclipse plugin for Java

    New plugin for Eclipse to get Java developers off the ground with Windows Azure

     

    Windows Azure Starter Kit for Java

    Improving experience for Java developers with Windows Azure

  • Korea Evangelist

    Expression Web 4 Service Pack 1 Released with IE9 Support

    • 0 Comments

    Service Pack 1 for Expression Web 4, Microsoft's web page authoring tool, has been released. It adds many features that will help anyone doing web-related work, including support for HTML5 and CSS3, which have drawn even more attention since the official release of IE9, and improvements to SuperPreview.

    The additions are listed below.

       

    • HTML5 support
      • Complete support in Code Editor & Design View
    • CSS3 support
      • New properties from CSS3 draft specifications
      • CSS Properties pane, New Style & Modify Style dialog boxes
    • Extended PHP support
      • PHP 5.3 IntelliSense support
      • Open as PHP, allowing any file to be treated as a PHP document
    • Improved SuperPreview
      • IE9 support
    • Safari 4 & 5 support for the Mac

         

    Download Expression Web 4 Service Pack 1

       

    For more details, click the link below.

    Description of Expression Web 4 Service Pack 1

  • Korea Evangelist

    Differences in JavaScript Handling Between 64-bit and 32-bit IE9

    • 1 Comments

    A post by 웹초보 observes that 64-bit IE9 benchmark results fall short of the 32-bit results and concludes that the 64-bit IE9 engine is the same as the old one, but that is not actually the case.

    64-bit IE9 uses exactly the same JavaScript engine, "Chakra," as the 32-bit version. The 32-bit version, however, additionally includes a JIT compiler that translates scripts into machine code before they run. Because this JIT compiler is not included in the 64-bit build, there is a difference in speed. Even without the JIT compiler, 64-bit IE9 still shows JavaScript performance more than five times faster than IE8.

    See here for the details.

    For reference, with 64-bit IE9 every add-on, BHO, ActiveX control, and so on would have to be rewritten for 64-bit, so website compatibility is hard to guarantee for users; for that reason, the 32-bit version is set as the default after IE9 is installed.
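
    As an aside (not from the original post): if a page needs to tell which flavor of IE9 a visitor is running, the user-agent string is one rough signal; 64-bit IE9 reports "Win64; x64" while the default 32-bit IE9 on 64-bit Windows reports "WOW64". A minimal sketch under that assumption, keeping in mind that user-agent strings can be spoofed:

    <!DOCTYPE html>
    <html>
    <head><title>IE9 bitness check</title></head>
    <body>
    <script type="text/javascript">
      // Rough heuristic based on common IE9 user-agent tokens; informational only.
      var ua = navigator.userAgent;
      if (ua.indexOf("MSIE 9.0") !== -1) {
        if (ua.indexOf("Win64") !== -1 && ua.indexOf("x64") !== -1) {
          document.write("64-bit IE9 (Chakra without the JIT compiler)");
        } else if (ua.indexOf("WOW64") !== -1) {
          document.write("32-bit IE9 on 64-bit Windows (JIT enabled)");
        } else {
          document.write("32-bit IE9 on 32-bit Windows (JIT enabled)");
        }
      }
    </script>
    </body>
    </html>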

     

Page 1 of 3 (14 items)