One use of the term computer security refers to technology to implement a secure operating system. Much of this technology is based on science developed in the 1980s and used to produce what may be some of the most impenetrable operating systems ever. Though still valid, the technology is in limited use today, primarily because it imposes some changes to system management and also because it is not widely understood. Such ultra-strong secure operating systems are based on operating system kernel technology that can guarantee that certain security policies are absolutely enforced in an operating environment. An example of such a computer security policy is the Bell-LaPadula model. The strategy is based on coupling special microprocessor hardware features, often involving the memory management unit, to a special, correctly implemented operating system kernel. This forms the foundation for a secure operating system which, if certain critical parts are designed and implemented correctly, can ensure the absolute impossibility of penetration by hostile elements. This capability is enabled because the configuration not only imposes a security policy, but in theory completely protects itself from corruption. Ordinary operating systems, on the other hand, lack the features that assure this maximal level of security. The design methodology to produce such secure systems is precise, deterministic and logical.
Systems designed with such methodology represent the state of the art of computer security, although products using such security are not widely known. In sharp contrast to most kinds of software, they meet specifications with verifiable certainty comparable to specifications for size, weight and power. Secure operating systems designed this way are used primarily to protect national security information, military secrets, and the data of international financial institutions. These are very powerful security tools, and very few secure operating systems have been certified at the highest level (Orange Book A-1) to operate over the range of "Top Secret" to "unclassified" (including Honeywell SCOMP, USAF SACDIN, NSA Blacker and Boeing MLS LAN). The assurance of security depends not only on the soundness of the design strategy, but also on the assurance of correctness of the implementation, and therefore there are degrees of security strength defined for COMPUSEC. The Common Criteria quantifies the security strength of products in terms of two components, security functionality and assurance level (such as EAL levels), and these are specified in a Protection Profile for requirements and a Security Target for product descriptions. None of these ultra-high-assurance secure general-purpose operating systems have been produced for decades or certified under the Common Criteria.
In US parlance, the term High Assurance usually suggests the system has the right security functions, implemented robustly enough to protect DoD and DoE classified information. Medium assurance suggests it can protect less valuable information, such as income tax information. Secure operating systems designed to meet medium robustness levels of security functionality and assurance have seen wider use within both government and commercial markets. Medium-robustness systems may provide the same security functions as high-assurance secure operating systems, but do so at a lower assurance level (such as Common Criteria levels EAL4 or EAL5). Lower levels mean less certainty that the security functions are implemented flawlessly, and therefore that they can be depended upon. These systems are found on web servers, guards, database servers, and management hosts, and are used not only to protect the data stored on these systems but also to provide a high level of protection for network connections and routing services.
Tuesday, March 2, 2010
Computer security architecture
Security architecture can be defined as the design artifacts that describe how the security controls (security countermeasures) are positioned and how they relate to the overall information technology architecture. These controls serve to maintain the system's quality attributes, among them confidentiality, integrity, availability, accountability and assurance. A security architecture is the plan that shows where security measures need to be placed. If the plan describes a specific solution, then prior to building it one would perform a risk analysis. If the plan describes a generic high-level design (reference architecture), then the plan should be based on a threat analysis.
Computer security by design
Secure by design, in software engineering, means that the software has been designed from the ground up to be secure. Malicious practices are taken for granted, and care is taken to minimize the impact of a discovered security vulnerability or of invalid user input.
Generally, designs that work well do not rely on being secret. It is not mandatory, but proper security usually means that everyone is allowed to know and understand the design, because the design itself is secure rather than merely obscure. This has the advantage that many people are looking at the code, which improves the odds that any flaws will be found sooner (Linus's law). Of course, attackers can also obtain the code, which makes it easier for them to find vulnerabilities as well.
It is also very important that everything works with the least amount of privileges possible (least user access). For example, a Web server that runs as the administrative user (root or admin) can have the privilege to remove files and users that do not belong to it, so a flaw in that program could put the entire system at risk. On the other hand, a Web server that runs inside an isolated environment, and that only has the privileges required for its network and filesystem functions, cannot compromise the system it runs on unless the security around it is itself also flawed.
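To make this concrete, here is a minimal, illustrative sketch in POSIX C (not taken from any particular server) of a process that binds a privileged port while it is still root and then permanently drops to an unprivileged account before touching client data; the account name "nobody" and port 80 are assumptions chosen for the example.

    /* Least-privilege sketch: bind a privileged port as root, then drop
     * to an unprivileged account before handling any client data.
     * The user "nobody" and port 80 are illustrative assumptions. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <unistd.h>
    #include <pwd.h>
    #include <sys/socket.h>
    #include <netinet/in.h>
    #include <arpa/inet.h>

    int main(void) {
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        if (fd < 0) { perror("socket"); return 1; }

        struct sockaddr_in addr;
        memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(80);            /* privileged port: needs root */

        if (bind(fd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
            perror("bind");
            return 1;
        }

        /* Root was only needed for bind(); give the privileges up now. */
        struct passwd *pw = getpwnam("nobody");
        if (pw == NULL) { fprintf(stderr, "no such user\n"); return 1; }
        if (setgid(pw->pw_gid) != 0 || setuid(pw->pw_uid) != 0) {
            perror("drop privileges");
            return 1;
        }

        /* From here on, a flaw in request handling can no longer remove
         * arbitrary files or users: the process lacks the privileges. */
        listen(fd, 16);
        puts("serving as an unprivileged user ...");
        /* accept() / request-handling loop would go here */
        close(fd);
        return 0;
    }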
Taken to its extreme, a perfectly secure login system would allow no one to log in at all, because any user could be a threat to the system. In practice some designs can never be perfect: passwords, biometrics, and similar mechanisms all have weaknesses, so the aim is to reduce risk rather than eliminate it.
Computer security
Computer security is a branch of computer technology known as information security as applied to computers and networks. The objective of computer security includes protection of information and property from theft, corruption, or natural disaster, while allowing the information and property to remain accessible and productive to its intended users. The term computer system security means the collective processes and mechanisms by which sensitive and valuable information and services are protected from publication, tampering or collapse by unauthorized activities, untrustworthy individuals and unplanned events.
Computer scientist
A computer scientist is a scientist who has acquired knowledge of computer science, the study of the theoretical foundations of information and computation and their application in computer systems.
Computer scientists typically work on the theoretical side of computer systems, as opposed to the hardware side that computer engineers mainly focus on (although there is overlap). Although computer scientists can also focus their work and research on specific areas (such as algorithm development and design, software engineering, information theory, database theory, computational complexity theory, human-computer interaction, computer programming, programming language theory, computer graphics and computer vision), their foundation is the theoretical study of computing from which these other fields derive. Computer scientists typically have some kind of degree in computer science.
As its name implies, computer science is a pure science, not an applied science or applied business field. As an analogy to the medical field, a computer scientist is like the cancer researcher who might study molecular biology or biochemistry in-depth, while an information technology specialist is like the physician who studies those fields at a higher level and focuses on their application to patient care.
Computer scientists can follow more practical applications of their knowledge, doing things such as software development, web development and database programming. Computer scientists can also be found in the field of information technology consulting.
Computer scientists normally get their degree in computer science at an accredited university or institution.
Computer science relationship
Despite its name, a significant amount of computer science does not involve the study of computers themselves. Because of this, several alternative names have been proposed. Certain departments of major universities prefer the term computing science, to emphasize precisely that difference. Danish scientist Peter Naur suggested the term datalogy, to reflect the fact that the scientific discipline revolves around data and data treatment, while not necessarily involving computers. The first scientific institution to use the term was the Department of Datalogy at the University of Copenhagen, founded in 1969, with Peter Naur being the first professor in datalogy. The term is used mainly in the Scandinavian countries. Also, in the early days of computing, a number of terms for the practitioners of the field of computing were suggested in the Communications of the ACM – turingineer, turologist, flow-charts-man, applied meta-mathematician, and applied epistemologist. Three months later in the same journal, comptologist was suggested, followed next year by hypologist. The term computics has also been suggested. In continental Europe, names such as informatique (French), Informatik (German) or informatica (Dutch), derived from information and possibly mathematics or automatic, are more common than names derived from computer/computation.
The renowned computer scientist Edsger Dijkstra stated, "Computer science is no more about computers than astronomy is about telescopes." The design and deployment of computers and computer systems is generally considered the province of disciplines other than computer science. For example, the study of computer hardware is usually considered part of computer engineering, while the study of commercial computer systems and their deployment is often called information technology or information systems. However, there has been much cross-fertilization of ideas between the various computer-related disciplines. Computer science research has also often crossed into other disciplines, such as philosophy, cognitive science, linguistics, mathematics, physics, statistics, and economics.
Computer science is considered by some to have a much closer relationship with mathematics than many scientific disciplines, with some observers saying that computing is a mathematical science. Early computer science was strongly influenced by the work of mathematicians such as Kurt Gödel and Alan Turing, and there continues to be a useful interchange of ideas between the two fields in areas such as mathematical logic, category theory, domain theory, and algebra.
The relationship between computer science and software engineering is a contentious issue, which is further muddied by disputes over what the term "software engineering" means, and how computer science is defined. David Parnas, taking a cue from the relationship between other engineering and science disciplines, has claimed that the principal focus of computer science is studying the properties of computation in general, while the principal focus of software engineering is the design of specific computations to achieve practical goals, making the two separate but complementary disciplines.
The academic, political, and funding aspects of computer science tend to depend on whether a department was formed with a mathematical emphasis or with an engineering emphasis. Computer science departments with a mathematics emphasis and a numerical orientation consider alignment with computational science. Both types of departments tend to make efforts to bridge the field educationally, if not across all research.
Computer science education
Some universities teach computer science as a theoretical study of computation and algorithmic reasoning. These programs often feature the theory of computation, analysis of algorithms, formal methods, concurrency theory, databases, computer graphics and systems analysis, among others. They typically also teach computer programming, but treat it as a vessel for the support of other fields of computer science rather than a central focus of high-level study.
Other colleges and universities, as well as secondary schools and vocational programs that teach computer science, emphasize the practice of advanced programming rather than the theory of algorithms and computation in their computer science curricula. Such curricula tend to focus on those skills that are important to workers entering the software industry. The practical aspects of computer programming are often referred to as software engineering. However, there is a lot of disagreement over the meaning of the term, and whether or not it is the same thing as programming.
Computer science history
The early foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks, such as the abacus, have existed since antiquity. Wilhelm Schickard built the first mechanical calculator in 1623. In the Victorian era, Charles Babbage designed a difference engine and later an analytical engine, for which Ada Lovelace wrote what is often considered the first program. Around 1900, punch-card machines were introduced. However, all of these machines were constrained to perform a single task, or at best some subset of all possible tasks.
During the 1940s, as newer and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors. As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s. The first computer science degree program in the United States was formed at Purdue University in 1962. Since practical computers became available, many applications of computing have become distinct areas of study in their own right.
Although many initially believed it was impossible that computers themselves could actually be a scientific field of study, in the late fifties it gradually became accepted among the greater academic population. It is the now well-known IBM brand that formed part of the computer science revolution during this time. IBM (short for International Business Machines) released the IBM 704 and later the IBM 709 computers, which were widely used during the exploration period of such devices. "Still, working with the IBM [computer] was frustrating...if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again". During the late 1950s, the computer science discipline was very much in its developmental stages, and such issues were commonplace.
Time has seen significant improvements in the usability and effectiveness of computer science technology. Modern society has seen a significant shift from computers being used solely by experts or professionals to a more widespread user base.
Computer science
Computer science or computing science (sometimes abbreviated CS) is the study of the theoretical foundations of information and computation, and of practical techniques for their implementation and application in computer systems. It is frequently described as the systematic study of algorithmic processes that create, describe, and transform information. According to Peter J. Denning, the fundamental question underlying computer science is, "What can be (efficiently) automated?" Computer science has many sub-fields; some, such as computer graphics, emphasize the computation of specific results, while others, such as computational complexity theory, study the properties of computational problems. Still others focus on the challenges in implementing computations. For example, programming language theory studies approaches to describing computations, while computer programming applies specific programming languages to solve specific computational problems, and human-computer interaction focuses on the challenges in making computers and computations useful, usable, and universally accessible to people.
The general public sometimes confuses computer science with careers that deal with computers (such as information technology), or thinks that it relates to their own experience of computers, which typically involves activities such as gaming, web browsing, and word processing. However, the focus of computer science is more on understanding the properties of the programs used to implement software such as games and web browsers, and on using that understanding to create new programs or improve existing ones.
Computer science deals with the theoretical foundations of information and computation, and of practical techniques for their implementation and application.
Mac OS X Public Beta
The Mac OS X Public Beta is an early beta version of Apple Computer's Mac OS X operating system. It was released to the public on September 13, 2000 for US$29.95. It allowed software developers and early adopters to get a taste of the upcoming operating system and to develop software for it before its final release. It had a build number of 1H39.[1]
The Public Beta succeeded Mac OS X Server 1.0, the first public release of Apple's new NeXT OpenStep-based operating system, which used a variant of the classic Mac OS "Platinum" user interface look and feel. The Public Beta introduced the Aqua user interface to the world. Fundamental user interface changes were revealed with respect to fonts, the Dock, and the menu bar (with an Apple logo at its center, later repositioned in response to public feedback). System icons were much larger and more detailed, and new interface eye candy was prevalent.
With the Mac OS X Public Beta came fundamental technical changes, most courtesy of an open source Darwin core, including two features that Mac users had been anticipating for almost a decade: preemptive multitasking and protected memory. At the Macworld Expo in June 2000, Apple CEO Steve Jobs demonstrated Bomb.app, a test application intended to crash.[2] The application crashed, showing a dialog indicating that other applications were not affected (since Mac OS X had memory protection). A cheer arose from the crowd, as the older Mac OS dialog recommended a complete system restart after such an event.
Some features were missing or not working in this release: there was no support for web-based Java applets or for printing, Carbon was in a very incomplete state, and Classic applications couldn't access the network. This last was a particular handicap, as the only native web browsers then available were a beta version of Microsoft Internet Explorer and OmniGroup's OmniWeb, a holdover from the NeXT platform. Users of Netscape or Mozilla browsers had to wait.
Native applications in general were few and far between. Users had to turn to open source or shareware alternatives, giving rise to an active homebrew software community around the new operating system.
Apple used user feedback to incorporate improvements in the retail version that was to follow.
Mac OS X Public Beta expired and ceased to function in Spring 2001.[3]
Mac OS X v10.0 was the first completed release of Mac OS X. It became available in March 2001. Owners of the Public Beta version were entitled to a $29.95 discount on the price of the first full version of Mac OS X 10.0.
Mac Version

With the exception of Mac OS X Server 1.0 and the original public beta, Mac OS X versions are named after big cats. Prior to its release, version 10.0 was code named "Cheetah" internally at Apple, and version 10.1 was code named internally as "Puma". After the immense buzz surrounding version 10.2, codenamed "Jaguar", Apple's product marketing began openly using the code names to promote the operating system. 10.3 was marketed as "Panther", 10.4 as "Tiger", and 10.5 as "Leopard". "Snow Leopard" is the name for the current release, version 10.6. "Panther", "Tiger" and "Leopard" are registered as trademarks of Apple, but "Cheetah", "Puma" and "Jaguar" have never been registered. Apple has also registered "Lynx" and "Cougar" as trademarks, though these were allowed to lapse. Computer retailer Tiger Direct sued Apple for its use of the name "Tiger". On May 16, 2005 a US federal court in the Southern District of Florida ruled that Apple's use does not infringe on Tiger Direct's trademark.
Mac Features

(Image caption: when a widget is added to the Dashboard, it appears with a ripple effect.)
One of the major differences between the previous versions of Mac OS and OS X was the addition of the Aqua GUI, a graphical user interface with water-like elements. Every window element (text, graphics, and widgets) is drawn on-screen using anti-aliasing technology. ColorSync, a technology introduced many years before, was improved and built into the core drawing engine to provide color matching for printing and multimedia professionals. Also, drop shadows were added around windows and isolated text elements to provide a sense of depth. New interface elements were integrated, including sheets (document modal dialog boxes attached to specific windows) and drawers.
Apple has continued to change aspects of the OS X appearance and design, particularly with tweaks to the appearance of windows and the menu bar. One example of a UI behavioral change is that previewed video and audio files no longer have progress bars in column view; instead, they have mouse-over start and stop buttons as of 10.5.
The human interface guidelines published by Apple for Mac OS X are followed by many applications, giving them consistent user interfaces and keyboard shortcuts. In addition, new services for applications are included, such as spelling and grammar checkers, a special characters palette, a color picker, a font chooser and a dictionary; these global features are present in every Cocoa application, adding consistency. The graphics system OpenGL composites windows onto the screen to allow hardware-accelerated drawing. This technology, introduced in version 10.2, is called Quartz Extreme, a component of Quartz. Quartz's internal imaging model correlates well with the Portable Document Format (PDF) imaging model, making it easy to output PDF to multiple devices. As a side result, PDF viewing is a built-in feature.
In version 10.3, Apple added Exposé, a feature that provides three functions to help manage windows and the desktop: instantly displaying all open windows as thumbnails for easy navigation between tasks, displaying the open windows of the current application as thumbnails, and hiding all windows to access the desktop. Also, FileVault was introduced, an optional encryption of the user's files with the Advanced Encryption Standard (AES-128).
Features introduced in version 10.4 include Automator, an application designed to create an automatic workflow for different tasks; Dashboard, a full-screen group of small applications called desktop widgets that can be called up and dismissed in one keystroke; and Front Row, a media viewer interface accessed by the Apple Remote. Moreover, the Sync Services were included, which is a system that allows applications to access a centralized extensible database for various elements of user data, including calendar and contact items. The operating system then managed conflicting edits and data consistency.
As of version 10.5, all system icons are scalable up to 512×512 pixels, to accommodate various places where they appear in larger size, including for example the Cover Flow view, a three-dimensional graphical user interface included with iTunes, the Finder, and other Apple products for visually skimming through files and digital media libraries via cover artwork. This version includes Spaces, a virtual desktop implementation which enables the user to have more than one desktop and display them in an Exposé-like interface. Mac OS X v10.5 includes an automatic backup technology called Time Machine, which provides the ability to view and restore previous versions of files and application data; and Screen Sharing was built in for the first time.
Finder is a file browser allowing quick access to all areas of the computer, and it has been modified throughout subsequent releases of Mac OS X. Quick Look is part of Mac OS X Leopard's Finder; it allows dynamic previews of files, including videos and multi-page documents, without opening their parent applications. Spotlight search technology, which has been integrated into the Finder since Mac OS X Tiger, allows rapid real-time searches of data files, mail messages, photos, and other information based on item properties (metadata) and/or content. Mac OS X makes use of a Dock, which holds file and folder shortcuts as well as minimized windows.
Apple–Intel Transition
In April 2002, eWeek reported a rumor that Apple had a version of Mac OS X code-named Marklar which ran on Intel x86 processors. The idea behind Marklar was to keep Mac OS X running on an alternative platform should Apple become dissatisfied with the progress of the PowerPC platform. These rumors subsided until late in May 2005, when various media outlets, such as the Wall Street Journal and CNET, reported that Apple would unveil Marklar in the coming months.
On June 6, 2005, Steve Jobs confirmed these rumors when he announced in his keynote address at the annual Apple Worldwide Developers Conference that Apple would be making the transition from PowerPC to Intel processors over the following two years, and that Mac OS X would support both platforms during the transition. Jobs also confirmed rumors that Apple had kept versions of Mac OS X running on Intel processors for most of its developmental life. The last time that Apple switched CPU families (from the Motorola 68K CPU to the IBM/Motorola PowerPC), Apple included a Motorola 68K emulator in the new OS that made almost all 68K software work automatically on the new hardware. Apple supported the 68K emulator for 11 years, but dropped it during the transition to Intel CPUs. Included in the new OS for the Intel-based Macs is Rosetta, a binary translation layer which enables software compiled for PowerPC Mac OS X to run on Intel Mac OS X machines. However, Apple dropped support for Classic mode on the new Intel Macs. Third-party emulation software such as Mini vMac, Basilisk II and SheepShaver provides support for some early versions of Mac OS. A new version of Xcode and the underlying command-line compilers support building universal binaries that will run on either architecture.
Software that is available only for PowerPC is supported with Rosetta, though applications may have to be rewritten to run properly on the newer OS X for Intel. Apple encourages developers to produce universal binaries with support for both PowerPC and x86. There is a performance penalty when PowerPC binaries run on Intel Macs through Rosetta. Moreover, some PowerPC software, such as kernel extensions and System Preferences plugins, are not supported on Intel Macs. Some PowerPC applications would not run on Intel OS X at all. Plugins for Safari need to be compiled for the same platform as Safari, so when Safari is running on Intel it requires plug-ins that have been compiled as Intel-only or universal binaries, so PowerPC-only plug-ins will not work. While Intel Macs will be able to run PowerPC, x86, and universal binaries, PowerPC Macs will support only universal and PowerPC builds.
Support for the PowerPC platform remains in Mac OS X version 10.5. Such cross-platform capability already existed in Mac OS X's lineage; Openstep was ported to many architectures, including x86, and Darwin included support for both PowerPC and x86. Although Apple stated that Mac OS X would not run on Intel-based personal computers aside from its own, a hacked version of the OS compatible with conventional x86 hardware has been developed by the OSx86 community.
On June 8, 2009, Apple announced at its Worldwide Developers Conference that Snow Leopard (version 10.6) would drop support for PowerPC processors and be Intel-only. However, Rosetta is still supported. In Snow Leopard, Rosetta is not installed by default, but it is available on the installation DVD as an installable add-on.
Mac Hardware
For the early releases of Mac OS X, the standard hardware platform supported was the full line of Macintosh computers (laptop, desktop, or server) based on PowerPC G3, G4, and G5 processors. Later versions discontinued support for some older hardware; for example, Panther does not support "beige" G3s[35], and Tiger does not support systems that pre-date Apple's introduction of integrated FireWire ports (however the ports themselves are not a functional requirement). Mac OS X v10.5 "Leopard", introduced October 2007, has dropped support for all PowerPC G3 processors and for PowerPC G4 processors with clock rates below 867 MHz. Mac OS X v10.6 "Snow Leopard" supports only Macs with Intel processors, not PowerPC.
Tools such as XPostFacto and patches applied to the installation disc have been developed by third parties to enable installation of newer versions of Mac OS X on systems not officially supported by Apple. These include a number of pre-G3 Power Macintosh systems that can be made to run up to and including Mac OS X 10.2 Jaguar, and all G3-based Macs, which can run up to and including Tiger. Sub-867 MHz G4 Macs can run Leopard by removing the restriction from the installation DVD or by entering a command in the Mac's Open Firmware interface to tell the Leopard installer that the machine has a clock rate of 867 MHz or greater. Except for features requiring specific hardware (e.g. graphics acceleration, DVD writing), the operating system offers the same functionality on all supported hardware.
PowerPC versions of Mac OS X prior to Leopard retain compatibility with older Mac OS applications by providing an emulation environment called Classic, which allows users to run Mac OS 9 as a process within Mac OS X, so that most older applications run as they would under the older operating system. Classic is not supported on Intel-based Macs or in Mac OS X v10.5 "Leopard", although users still requiring Classic applications on Intel Macs can use the SheepShaver emulator to run Mac OS 9 on top of Leopard.
Mac Software
The APIs that Mac OS X inherited from OpenStep are not backward compatible with earlier versions of Mac OS. These APIs were created as the result of a 1993 collaboration between NeXT Computer and Sun Microsystems and are now referred to by Apple as Cocoa. This heritage is highly visible to Cocoa developers, since the "NS" prefix is ubiquitous in the framework, standing variously for Nextstep or NeXT/Sun. The official OpenStep API, published in September 1994, was the first to split the API between Foundation and Application Kit and the first to use the "NS" prefix. Apple's Rhapsody project would have required all new development to use these APIs, causing much outcry among existing Mac developers. All Mac software that did not receive a complete rewrite to the new framework would have run in the equivalent of the Classic environment. To permit a smooth transition from Mac OS 9 to Mac OS X, the Carbon Application Programming Interface (API) was created. Applications written with Carbon can be executed natively on both systems. Carbon was not included in the first product sold as Mac OS X, Mac OS X Server (now known as Mac OS X Server 1.x).
Mac OS X used to support the Java Platform as a "preferred software package"; in practice this means that applications written in Java fit as neatly into the operating system as possible while still being cross-platform compatible, and that graphical user interfaces written in Swing look almost exactly like native Cocoa interfaces. Traditionally, Cocoa programs have been mostly written in Objective-C, with Java as an alternative. However, on July 11, 2005, Apple announced that "features added to Cocoa in Mac OS X versions later than 10.4 will not be added to the Cocoa-Java programming interface."
Since Mac OS X is POSIX compliant, many software packages written for the *BSDs or Linux can be recompiled to run on it. Projects such as Fink, MacPorts and pkgsrc provide pre-compiled or pre-formatted packages. Since version 10.3, Mac OS X has included X11.app, Apple's version of the X Window System graphical interface for Unix applications, as an optional component during installation. Up to and including Mac OS X v10.4 (Tiger), Apple's implementation was based on the X11-licensed XFree86 4.3 and X11R6.6. All bundled versions of X11 feature a window manager which is similar to the Mac OS X look and feel and has fairly good integration with Mac OS X, also using the native Quartz rendering system. Earlier versions of Mac OS X (in which X11 was not bundled) can also run X11 applications using XDarwin. With the introduction of version 10.5, Apple switched to the X.Org variant of X11.
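As a small, hedged illustration of that POSIX portability, the program below uses only standard POSIX interfaces (uname and sysconf), so the same source should compile unchanged on Mac OS X, the BSDs, or Linux; the strings it prints naturally depend on the host system.

    /* A program written only against POSIX interfaces; it builds and runs
     * unchanged on Mac OS X, the BSDs, or Linux, which is the portability
     * described above. */
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/utsname.h>

    int main(void) {
        struct utsname un;
        if (uname(&un) != 0) {        /* POSIX: report kernel name and release */
            perror("uname");
            return 1;
        }
        /* On Mac OS X this prints "Darwin"; on Linux it prints "Linux". */
        printf("%s %s on %s\n", un.sysname, un.release, un.machine);
        printf("page size: %ld bytes\n", sysconf(_SC_PAGESIZE));
        return 0;
    }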
Mac OS X Description
Mac OS X is the tenth major version of Apple's operating system for Macintosh computers. Previous Macintosh operating systems were named using Arabic numerals, e.g. Mac OS 8 and Mac OS 9. The letter X in Mac OS X's name refers to the number 10, a Roman numeral. It is therefore correctly pronounced "ten" /ˈtɛn/ in this context, though "X" /ˈɛks/ is also a common pronunciation.
Mac OS X's core is a POSIX compliant operating system (OS) built on top of the XNU kernel, with standard Unix facilities available from the command line interface. Apple released this family of software as a free and open source operating system named Darwin, but it later became partially proprietary. On top of Darwin, Apple layered a number of components, including the Aqua interface and the Finder, to complete the GUI-based operating system which is Mac OS X.
Mac OS X introduced a number of new capabilities to provide a more stable and reliable platform than its predecessor, Mac OS 9. For example, pre-emptive multitasking and memory protection improved the system's ability to run multiple applications simultaneously without them interrupting or corrupting each other. Many aspects of Mac OS X's architecture are derived from Openstep, which was designed to be portable, to ease the transition from one platform to another. For example, Nextstep was ported from the original 68k-based NeXT workstations to x86 and other architectures before NeXT was purchased by Apple, and OpenStep was later ported to the PowerPC architecture as part of the Rhapsody project.
The most visible change was the Aqua theme. The use of soft edges, translucent colors, and pinstripes – similar to the hardware design of the first iMacs – brought more texture and color to the user interface when compared to what OS 9 and OS X Server 1.0's "Platinum" appearance had offered. According to John Siracusa, an editor of Ars Technica, the introduction of Aqua and its departure from the then conventional look "hit like a ton of bricks." However Bruce Tognazzini (who founded the original Apple Human Interface Group) said that the Aqua interface in Mac OS X v10.0 represented a step backwards in usability compared with the original Mac OS interface. Despite the controversial new interface, third-party developers started producing skins for customizable applications for Mac and other operating systems which mimicked the Aqua appearance. To some extent, Apple has used the successful transition to this new design as leverage, at various times threatening legal action against people who make or distribute software with an interface the company claims is derived from its copyrighted design.
The Mac OS X architecture implements a layered framework. The layered framework aids rapid development of applications by providing existing code for common tasks.
Mac OS X includes its own software development tools, most prominently an integrated development environment called Xcode. Xcode provides interfaces to compilers that support several programming languages including C, C++, Objective-C, and Java. For the Apple–Intel transition, it was modified so that developers could build their applications as a universal binary, which provides compatibility with both the Intel-based and PowerPC-based Macintosh lines.
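As a rough sketch of that universal binary workflow (the -arch flags reflect the GCC toolchain Apple shipped with Xcode in this era, and hello.c is just an invented example), a trivial C program can be compiled once for both architectures and then inspected with lipo:

    /* hello.c: a trivial program used to illustrate a universal binary build.
     * Build commands (run in Terminal; flags as supported by the Xcode-era
     * GCC toolchain described above):
     *
     *   gcc -arch ppc -arch i386 -o hello hello.c
     *   lipo -info hello     # lists the architectures contained in the binary
     *
     * The resulting file contains both a PowerPC and an Intel slice, and the
     * loader picks the one matching the machine it runs on. */
    #include <stdio.h>

    int main(void) {
        printf("Hello from a universal binary candidate.\n");
        return 0;
    }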
The Darwin sub-system in Mac OS X is in charge of managing the filesystem, which includes the Unix permissions layer. In 2003 and 2005, two Macworld editors expressed criticism of the permission scheme; Ted Landau called misconfigured permissions "the most common frustration" in Mac OS X, while Rob Griffiths suggested that some users may even have to reset permissions every day, a process which can take up to 15 minutes. More recently, another Macworld editor, Dan Frakes, called the procedure of repairing permissions vastly overused. He argues that Mac OS X typically handles permissions properly without user interference, and resetting permissions should be tried only when problems emerge.
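To make the "Unix permissions layer" concrete, the following illustrative C sketch (an assumption-laden example, not Apple code) prints the owner, group and mode bits that the kernel consults when a file is accessed:

    /* A minimal look at the Unix permissions layer mentioned above: every file
     * has an owner, a group, and mode bits that Darwin's kernel checks on each
     * access. This sketch prints them for a path given on the command line. */
    #include <stdio.h>
    #include <sys/stat.h>

    int main(int argc, char *argv[]) {
        if (argc != 2) {
            fprintf(stderr, "usage: %s <path>\n", argv[0]);
            return 1;
        }
        struct stat st;
        if (stat(argv[1], &st) != 0) {
            perror("stat");
            return 1;
        }
        /* Mode bits in octal, e.g. 0644 = owner read/write, group/other read. */
        printf("%s: owner uid %u, group gid %u, mode %o\n",
               argv[1],
               (unsigned)st.st_uid,
               (unsigned)st.st_gid,
               (unsigned)(st.st_mode & 07777));
        return 0;
    }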
As of 2009, Mac OS X is the second most popular general-purpose operating system in use on the internet, after Microsoft Windows, with a 4.5% market share according to statistics compiled by Net Applications. By the same measure, it is the most successful UNIX-like desktop operating system on the internet, estimated at over four times the penetration of the free Linux. Mac OS X is available in a variety of languages, including English, Japanese, French, German, Spanish, Portuguese and Italian.
Mac OS X's core is a POSIX compliant operating system (OS) built on top of the XNU kernel, with standard Unix facilities available from the command line interface. Apple released this family of software as a free and open source operating system named Darwin, but it later became partially proprietary. On top of Darwin, Apple layered a number of components, including the Aqua interface and the Finder, to complete the GUI-based operating system which is Mac OS X.
Mac OS X introduced a number of new capabilities to provide a more stable and reliable platform than its predecessor, Mac OS 9. For example, pre-emptive multitasking and memory protection improved the system's ability to run multiple applications simultaneously without them interrupting or corrupting each other. Many aspects of Mac OS X's architecture are derived from Openstep, which was designed to be portable, to ease the transition from one platform to another. For example, Nextstep was ported from the original 68k-based NeXT workstations to x86 and other architectures before NeXT was purchased by Apple, and OpenStep was later ported to the PowerPC architecture as part of the Rhapsody project.
The most visible change was the Aqua theme. The use of soft edges, translucent colors, and pinstripes – similar to the hardware design of the first iMacs – brought more texture and color to the user interface when compared to what OS 9 and OS X Server 1.0's "Platinum" appearance had offered. According to John Siracusa, an editor of Ars Technica, the introduction of Aqua and its departure from the then conventional look "hit like a ton of bricks." However Bruce Tognazzini (who founded the original Apple Human Interface Group) said that the Aqua interface in Mac OS X v10.0 represented a step backwards in usability compared with the original Mac OS interface. Despite the controversial new interface, third-party developers started producing skins for customizable applications for Mac and other operating systems which mimicked the Aqua appearance. To some extent, Apple has used the successful transition to this new design as leverage, at various times threatening legal action against people who make or distribute software with an interface the company claims is derived from its copyrighted design.
Mac OS X Architecture implements a layered framework. The layered framework aids rapid development of applications by providing existing code for common tasks.
Mac OS X includes its own software development tools, most prominently an integrated development environment called Xcode. Xcode provides interfaces to compilers that support several programming languages including C, C++, Objective-C, and Java. For the Apple–Intel transition, it was modified so that developers could build their applications as a universal binary, which provides compatibility with both the Intel-based and PowerPC-based Macintosh lines.
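As an aside (my own sketch, not Apple documentation): a universal binary is a "fat" Mach-O file whose header begins with the FAT_MAGIC value 0xCAFEBABE followed by a count of the architectures it contains, so a rough check can be made by reading the first eight bytes. A real tool would rely on Apple's lipo or file utilities instead, since Java class files happen to share the same magic bytes.

# Minimal sketch: guess whether a file is a universal (fat) Mach-O binary.
import struct
import sys

def looks_universal(path):
    with open(path, "rb") as f:
        header = f.read(8)
    if len(header) < 8:
        return False
    magic, nfat_arch = struct.unpack(">II", header)
    # Fat binaries carry a small architecture count (e.g. 2 for ppc + i386).
    return magic == 0xCAFEBABE and nfat_arch < 20

if __name__ == "__main__":
    target = sys.argv[1] if len(sys.argv) > 1 else "/usr/bin/env"
    print(target, "looks universal:", looks_universal(target))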
The Darwin sub-system in Mac OS X is in charge of managing the filesystem, which includes the Unix permissions layer. In 2003 and 2005, two Macworld editors expressed criticism of the permission scheme; Ted Landau called misconfigured permissions "the most common frustration" in Mac OS X, while Rob Griffiths suggested that some users may even have to reset permissions every day, a process which can take up to 15 minutes. More recently, another Macworld editor, Dan Frakes, called the procedure of repairing permissions vastly overused. He argues that Mac OS X typically handles permissions properly without user intervention, and that resetting permissions should be tried only when problems emerge.
As of 2009, Mac OS X is the second most popular general-purpose operating system in use on the internet, after Microsoft Windows, with a 4.5% market share according to statistics compiled by Net Applications. It is also the most successful UNIX-like desktop operating system on the internet, estimated at over four times the penetration of Linux. Mac OS X is available in a variety of languages, including English, Japanese, French, German, Spanish, Portuguese and Italian.
Mac History
Mac OS X is based upon the Mach kernel. Certain parts from FreeBSD's and NetBSD's implementations of Unix were incorporated in NeXTSTEP, the core of Mac OS X. NeXTSTEP was the object-oriented operating system developed by Steve Jobs' company NeXT after he left Apple in 1985. While Jobs was away from Apple, Apple tried to create a "next-generation" OS through the Taligent, Copland and Gershwin projects, with little success.
Eventually, NeXT's OS, then called OPENSTEP, was selected to be the basis for Apple's next OS, and Apple purchased NeXT outright. Steve Jobs returned to Apple as interim CEO, and later became CEO, shepherding the transformation of the programmer-friendly OPENSTEP into a system that would be adopted by Apple's primary market of home users and creative professionals. The project was first known as Rhapsody and was later renamed to Mac OS X.
Mac OS X Server 1.x was incompatible with software designed for the original Mac OS and had no support for Apple's own IEEE 1394 interface (FireWire). Mac OS X 10.x added more backward compatibility and functionality by including the Carbon API as well as FireWire support. As the operating system evolved, it moved away from the legacy Mac OS to an emphasis on new "digital lifestyle" applications such as the iLife suite, enhanced business applications (iWork), and integrated home entertainment (the Front Row media center). Each version also included modifications to the general interface, such as the brushed metal appearance added in version 10.3, the non-pinstriped titlebar appearance in version 10.4, and in 10.5 the removal of the previous brushed metal styles in favor of the "Unified" gradient window style.
Mac OS X
Mac OS X (pronounced /mæk oʊ ɛs tɛn/) is an operating system developed, marketed, and sold by Apple Inc., and since 2002 has been included with all new Macintosh computer systems. It is the successor to Mac OS 9, the final release of the "classic" Mac OS, which had been Apple's primary operating system since 1984.
Mac OS X, whose "X" represents the Roman numeral for "10" and is a prominent part of its brand identity, is a Unix-based operating system, built on technologies developed at NeXT between the second half of the 1980s and Apple's purchase of the company in late 1996. Its sixth release Mac OS X v10.5 "Leopard" gained UNIX 03 certification while running on Intel processors.[1]
The first version released was Mac OS X Server 1.0 in 1999, and a desktop-oriented version, Mac OS X v10.0 "Cheetah", followed on March 24, 2001. Releases of Mac OS X are named after big cats: for example, Mac OS X v10.6 is usually referred to by Apple and users as "Snow Leopard". The server edition, Mac OS X Server, is architecturally identical to its desktop counterpart, and includes tools to facilitate management of workgroups of Mac OS X machines, and to provide access to network services. These tools include a mail transfer agent, a Samba server, an LDAP server, a domain name server, and others. It is pre-loaded on Apple's Xserve server hardware, but can be run on almost all of Apple's currently shipping computer models.
Apple also produces specialized versions of Mac OS X for use on four of its consumer devices: the iPhone OS for the iPhone, iPod Touch, and iPad, as well as an unnamed version for the Apple TV.
Mac OS X, whose "X" represents the Roman numeral for "10" and is a prominent part of its brand identity, is a Unix-based operating system, built on technologies developed at NeXT between the second half of the 1980s and Apple's purchase of the company in late 1996. Its sixth release Mac OS X v10.5 "Leopard" gained UNIX 03 certification while running on Intel processors.[1]
The first version released was Mac OS X Server 1.0 in 1999, and a desktop-oriented version, Mac OS X v10.0 "Cheetah" followed on March 24, 2001. Releases of Mac OS X are named after big cats: for example, Mac OS X v10.6 is usually referred to by Apple and users as "Snow Leopard". The server edition, Mac OS X Server, is architecturally identical to its desktop counterpart, and includes tools to facilitate management of workgroups of Mac OS X machines, and to provide access to network services. These tools include a mail transfer agent, a Samba server, an LDAP server, a domain name server, and others. It is pre-loaded on Apple's Xserve server hardware, but can be run on almost all of Apple's current selling computer models.
Apple also produces specialized versions of Mac OS X for use on four of its consumer devices: the iPhone OS for the iPhone, iPod Touch, and iPad, as well as an unnamed version for the Apple TV.
Monday, March 1, 2010
ADSL2+ / ADSL Plus
G.992.5 is an ITU (International Telecommunication Union) standard, also referred to as ADSL2+ or ADSL2Plus.
Commercially it is notable for its maximum theoretical download speed of 24 Mbit/s.
ADSL2+ works the same way as a standard ADSL broadband service: it delivers a fast internet connection over your phone line. What's different is that ADSL2+ can extend the performance of standard broadband - much like a software upgrade can extend the performance of your computer's operating system.
How fast is ADSL2+?
Speed depends on a number of factors; most importantly, how close you live to your local phone exchange or cabinet. Other things, like the capability of your computer and the quality of the wiring in your house, will also have an impact. This is the case for any internet service provider that offers ADSL2+.
In theory, ADSL2+ can deliver maximum download speeds of up to 24 Mbit/s and maximum upload speeds of up to 1.0 Mbit/s. Like any standard ADSL broadband service, the "up to" speeds are the technology's maximum possible speeds and the actual speeds will vary for a number of reasons. Check out our page on broadband speed for more information on this.
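To make the distance point concrete, here is a purely illustrative calculation; the distance-to-speed figures in it are rough ballpark numbers of my own, not values taken from G.992.5 or from any provider.

# Illustrative sketch only: map approximate line length to a ballpark
# ADSL2+ downstream sync speed. The numbers are hypothetical examples.
ROUGH_SPEED_BY_DISTANCE_KM = [
    (0.5, 24.0),   # very short line: close to the theoretical maximum
    (1.0, 20.0),
    (2.0, 14.0),
    (3.0, 8.0),
    (4.0, 4.0),
    (5.0, 2.0),    # long line: little better than plain ADSL
]

def estimate_downstream_mbit(distance_km):
    """Return a ballpark downstream rate for a given line length."""
    for max_km, mbit in ROUGH_SPEED_BY_DISTANCE_KM:
        if distance_km <= max_km:
            return mbit
    return 1.0     # very long lines may only just sync

print(estimate_downstream_mbit(2.5), "Mbit/s (illustrative only)")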
AM2, AM2+, AM3
There seem to be quite a lot of people asking lately about the differences between AM2, AM2+, and AM3 and what works with what, so here is a rundown; a small compatibility sketch follows the socket list below.
CPU Sockets:
Socket AM2:
HyperTransport 2.0 (1 GHz)
940-pin socket; AM2 CPUs fit both AM2 and AM2+ sockets
Supports dual-channel DDR2 memory
Socket AM2+:
HyperTransport 3.0 (2.6 GHz max)
940-pin socket; AM2+ CPUs work in both AM2+ and AM2 sockets
AM2+ CPUs will work in AM2 motherboards depending on BIOS revision and chipset
Supports dual-channel DDR2 memory
Socket AM3:
HyperTransport 3.0 (2.6 GHz max)
Future AM3 motherboards will support up to 3.2 GHz with HyperTransport 3.1
941-pin socket; only AM3 CPUs will fit in an AM3 motherboard
AM3 CPUs are backward compatible with AM2 and AM2+ motherboards, depending on chipset and BIOS revisions
Supports dual-channel DDR2 memory when run on AM2/AM2+ motherboards
Supports dual-channel DDR3 memory when run on AM3 motherboards
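Here is the compatibility sketch mentioned above: the rules in the socket list reduce to a small lookup table. This is only my own summary of that list, not an official AMD compatibility reference, and the "depends" cases still hinge on the board's chipset and BIOS revision.

# Minimal sketch of the CPU-to-motherboard socket rules listed above.
COMPATIBILITY = {
    # (cpu_socket, board_socket): works?
    ("AM2",  "AM2"):  True,
    ("AM2",  "AM2+"): True,
    ("AM2",  "AM3"):  False,  # only AM3 CPUs fit an AM3 board
    ("AM2+", "AM2"):  "depends on BIOS revision and chipset",
    ("AM2+", "AM2+"): True,
    ("AM2+", "AM3"):  False,
    ("AM3",  "AM2"):  "depends on BIOS revision and chipset",
    ("AM3",  "AM2+"): "depends on BIOS revision and chipset",
    ("AM3",  "AM3"):  True,
}

def fits(cpu_socket, board_socket):
    return COMPATIBILITY.get((cpu_socket, board_socket), False)

print(fits("AM3", "AM2+"))   # depends on BIOS revision and chipset
print(fits("AM2", "AM3"))    # False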
_______________
AMD Chipsets:
AMD 790 Chipset Variants:
790FX:
x16, x16 Crossfire | x16, x16, x8 Triple Crossfire | x8, x8, x8, x8 Quad Crossfire
65nm Process
790X:
x8, x8 Crossfire
65nm Process
790GX:
x8, x8 Crossfire | Hybrid CrossfireX (ATI Hybrid Graphics, ATI PowerPlay)
55nm Process
HD 3200 Onboard Graphics
AMD 785G:
x16, x4 Crossfire or x8, x8 Crossfire depending on motherboard | Hybrid CrossfireX (ATI Hybrid Graphics, ATI PowerPlay)
55nm Process
HD 4200 Onboard Graphics
AMD 780 Chipset Variants:
780G:
x16, x8 Crossfire on select motherboards | Hybrid CrossfireX (ATI Hybrid Graphics, ATI PowerPlay)
55nm Process
HD 3200 Onboard Graphics (128MB SidePort memory on select motherboards)
780V:
No Crossfire
55nm Process
Radeon 3100 Onboard Graphics
AMD 770:
No official Crossfire compatibility | x16, x4 Crossfire on select motherboards
65nm Process
AMD 760G:
Hybrid CrossfireX (ATI Hybrid Graphics, ATI PowerPlay)
55nm Process
Radeon HD 3000 Onboard Graphics
AMD 740G:
No Crossfire
55nm Process
Radeon 2100 Onboard Graphics
_______________
Nvidia Chipsets:
Nvidia nForce 980a:
Rebadged 780a with AM3/DDR3 motherboard support
x16, x16 SLI | x8, x8, x8 Triple SLI | Hybrid SLI (GeForce Boost, HybridPower)
Current motherboards are DDR2 | chipset supports DDR3/AM3
65nm Process
GeForce 8200 Onboard Graphics
Nvidia nForce 780a:
x16, x16 SLI | x16, x8, x8 Triple SLI | Hybrid SLI (GeForce Boost, HybridPower)
65nm Process
GeForce 8200 Onboard Graphics
Nvidia nForce 750a:
x8, x8 SLI | Hybrid SLI (GeForce Boost, HybridPower)
65nm Process
GeForce 8200 Onboard Graphics
Nvidia nForce 730a:
Hybrid SLI (GeForce Boost, HybridPower)
65nm Process
GeForce 8300 Onboard Graphics
Nvidia nForce 720a:
Hybrid SLI (GeForce Boost, HybridPower)
65nm Process
GeForce 8200 Onboard Graphics
_______________
CPUs by Socket:
AM2+:
All original (65nm) Phenoms (Agena, Kuma, Toliman)
Phenom II X4 920
Phenom II X4 940
AM3:
Athlon II X2 240, 2.8 GHz
Athlon II X2 245, 2.9 GHz
Athlon II X2 250, 3.0 GHz
Phenom II X2 545, 3.0 GHz
Phenom II X2 550 Black Edition, 3.1 GHz
Phenom II X3 705e, 2.5 GHz
Phenom II X3 710, 2.6 GHz
Phenom II X3 720 Black Edition, 2.8 GHz
Phenom II X4 810, 2.6 GHz
Phenom II X4 905e, 2.5 GHz
Phenom II X4 945, 3.0 GHz
Phenom II X4 955, 3.2 GHz
Phenom II X4 965, 3.4 GHz
______________
Notes:
- AMD chipsets tend to offer better compatibility with newer processors
- Nvidia chipsets support SLI, while AMD chipsets support Crossfire
- Generally speaking, Socket AM3 will provide longer future compatibility than Socket AM2+