Operating System Hardening: Issues and Practices
Operating system hardening aims to minimize a computer's exposure to current and future threats by configuring the operating system and removing unnecessary applications and devices. In the modern world, security plays an increasingly important role. The information technology industry faces a loss of public confidence whenever vulnerabilities are discovered, and clients expect security to be delivered out of the box, even for programs that were designed with no security in mind. Software maintainers therefore face the challenge of improving the security of their programs; unfortunately, they are often under-equipped to do so (Bagchi, 2003).
Operating system hardening also covers installing a new server in a secure fashion and maintaining the security and integrity of the server and its application software. Anyone planning to acquire a virtual private server or a dedicated server should prepare a server-hardening checklist before launching a web application on that server.
UNIX, which runs on almost 66% of Internet servers, must first and foremost have its host and system configuration files hardened (Pichnarczyk et al., 1994).
System administrators should ensure that a UNIX server performs at its best by configuring the system control files; this is essential to optimizing the server's operation. Beyond this, applying secure patch updates is an essential part of hardening the server.
Every operating system ships with some security vulnerabilities, which is why Windows, UNIX and other operating systems must be hardened to reduce them.
The information systems manager looking after corporate servers, databases and firewalls should understand the fundamentals of operating system hardening: the discipline of ensuring that all known operating system vulnerabilities are closed and monitored.
Operating system hardening is an important part of secure computing. One must ensure that network performance and security measures prevent network problems, support effective troubleshooting and allow necessary actions to be taken quickly when problems do arise.
We need to know how network bandwidth is used in order to support accounting, auditing and network planning. Monitoring network traffic and conducting forensic analysis ensure adherence to company policies; any violation is recorded and stopped in a timely manner.
The process of securing an operating system is referred to as “hardening” or at times as “lock down.” It is important to an organization's security because it reduces the chances of attack. Issues arise because different operating systems within a data center require different hardening processes. For example, a system administrator may have deep expertise in one operating system but limited or no expertise in the others. System administrators also have the task of ensuring that each server remains consistently locked down to avoid vulnerabilities.
This requires maintaining a predetermined configuration policy on every server. Implementing sound security measures leaves data less susceptible to malicious attacks and able to rebound more quickly when attacks occur. One of the best practices for operating system security is standardized hardening of all servers using appropriate methodologies.
Maintaining security is a challenge. The effort depends on the size of the organization and the number of servers, which in turn determines the resources required. Automated tools can reduce the resources required and also help with consistency. In addition, a centralized view of all servers (regardless of operating system) gives system administrators the ability to assess the security state of groups of servers and to lock them down automatically.
Operating System Hardening
Hardening an operating system involves configuring the system so as to limit the possibility of either internal or external attack. While the hardening methods vary from one operating system to another, the baseline concepts for Windows, UNIX, Linux, Mac OS X or any other system are the same. Some basic hardening techniques include the following:
Non-essential services – It is important that an operating system be configured to perform only the tasks for which it is assigned. For example, SMTP or HTTP services should run on a system only when the host is functioning as a mail or web server.
Fixes and Patches – This is an ongoing endeavor; operating systems should be kept up to date with the most recent security updates.
Password Management – Many operating systems provide options for enforcing strong passwords. This prevents users from configuring weak passwords that can easily be guessed. Additionally, users should change passwords regularly, and user accounts should be disabled after a number of failed login attempts (a minimal illustration of such checks follows this list).
Unnecessary accounts – Unused user accounts must be disabled or deleted from the operating system. It is crucial to monitor employee turnover; when employees leave the organization, their accounts should be disabled or deleted.
Directory and File Protection – Strict controls should be put in place for access to directories and files, using file permissions and access control lists.
File System Encryption – Some file systems support encryption of files and folders. For additional security, it is vital that hard disk partitions holding sensitive data are formatted with a file system that supports encryption, such as NTFS on Windows.
Enable Logging – The operating system should be configured to log all errors, warnings and other activity.
File Sharing – Unnecessary file sharing should be disabled.
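As an illustration of the password management item above, the following Python sketch checks candidate passwords against an assumed policy (minimum length, several character classes) and a lockout threshold. The numeric limits are placeholders, and in a real deployment enforcement would normally be delegated to the operating system itself (for example, PAM on UNIX).

```python
# A minimal, illustrative password-policy check; the limits below are assumed
# policy values, not values taken from any particular operating system.
import re

MIN_LENGTH = 12          # assumed minimum password length
MAX_FAILED_LOGINS = 5    # assumed lockout threshold

def password_is_strong(password: str) -> bool:
    """Return True when the password satisfies the assumed complexity policy."""
    if len(password) < MIN_LENGTH:
        return False
    character_classes = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"]
    return all(re.search(pattern, password) for pattern in character_classes)

def should_lock_account(failed_attempts: int) -> bool:
    """Disable the account once the failed-login threshold is reached."""
    return failed_attempts >= MAX_FAILED_LOGINS

if __name__ == "__main__":
    print(password_is_strong("Tr1cky-Passphrase!"))  # True
    print(should_lock_account(6))                    # True
```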
Hardening the Network – Different techniques can be used to achieve this, such as updating hardware and software; a vital part of network hardening is ensuring that the firmware in routers is updated regularly with the latest fixes and patches.
Password Protection – Wireless access points and routers provide interfaces that allow the network to be remotely managed and accessed. Such devices should therefore be protected by strong passwords.
Unnecessary Services and Protocols – All services and protocols that are unnecessary should be disabled; ideally, the hosts on the network should have them removed entirely. For example, there is no need to have AppleTalk installed and configured on any system in a pure TCP/IP network environment.
Ports – Ports that are not needed should be blocked by a firewall, and the associated services on the hosts in the network should be disabled. For example, it makes no sense to allow traffic for port 80 through the firewall unless the network has a host acting as a web server (a simple reachability check is sketched after this list).
Wireless Security – Wireless networks should be configured to the highest available security level.
Restricted Network Access – A number of steps should be taken in order to prevent unauthorized access to internal networks. First and foremost, a firewall should be used between the network and the internet. Other available options may include the use of access control lists (ACLs) and Network Address Translation (NAT). Authorized remote access should only be enabled using secure tunnels and virtual private networks (Bagchi, 2003).
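As referenced in the ports item above, the following hedged sketch probes a server from the outside and reports any reachable TCP port that is not on an allowlist. The host address, the allowlist and the probe list are assumptions chosen for illustration; such probes should only be run against systems you administer.

```python
# Check which TCP ports answer from outside and flag anything unexpected.
import socket

HOST = "192.0.2.10"                  # assumed address of the server under test
ALLOWED_PORTS = {80, 443}            # assumed: only web traffic is expected
PORTS_TO_CHECK = [21, 22, 23, 25, 53, 80, 110, 143, 443, 3306]

def port_is_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for port in PORTS_TO_CHECK:
        if port_is_open(HOST, port) and port not in ALLOWED_PORTS:
            print(f"Unexpected open port: {port}")
```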
Application Hardening
This involves including all applications and services installed on host systems in a network in the hardening procedure; this ensures they do not become weak or vulnerable links in the security defenses. Services and applications that are installed alongside the operating system with default settings need to be reviewed.
Web Servers
These are responsible for serving web pages requested by web browsers; web hosting is among the most common web server services available.
Private sites require authentication methods to be in place. An intranet approach (sites restricted to internal members only) can be used to block unauthorized access by external users; a firewall is used for this.
Web server logs need to be reviewed routinely for any suspicious activity (a simple log review is sketched below).
As is the case with all software, necessary steps should be taken to ensure that web servers are updated with the latest vendor-supplied patches.
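A minimal sketch of the routine log review mentioned above, assuming an Apache-style common/combined log format and a placeholder log path and threshold; a real review would look for many more patterns than these two.

```python
# Flag directory-traversal attempts and clients generating unusual 404 counts.
from collections import Counter
import re

LOG_FILE = "/var/log/httpd/access_log"   # assumed log location
NOT_FOUND_THRESHOLD = 50                 # assumed per-client 404 threshold

# client, request line and status code from a common-log-format entry
line_pattern = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "([^"]*)" (\d{3})')

def review_log(path: str) -> None:
    not_found_by_client = Counter()
    with open(path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = line_pattern.match(line)
            if not match:
                continue
            client, request, status = match.groups()
            if "../" in request:
                print(f"Possible traversal attempt from {client}: {request}")
            if status == "404":
                not_found_by_client[client] += 1
    for client, count in not_found_by_client.items():
        if count > NOT_FOUND_THRESHOLD:
            print(f"{client} generated {count} 404 responses")

if __name__ == "__main__":
    review_log(LOG_FILE)
```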
Email Servers
To secure mail servers, configuration options that are not needed by the software running on the mail servers should be disabled and the most recent vendor-supplied updates applied. Authentication should be used to ensure that only authorized users are able to send and receive email messages.
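The following sketch illustrates the client side of such a policy: authenticated mail submission over STARTTLS using Python's standard smtplib. The host name, port and credentials are placeholders, and server-side hardening (disabling open relaying, applying vendor updates) still has to be done in the mail server's own configuration.

```python
# Authenticated, TLS-protected mail submission with the standard library.
import smtplib
from email.message import EmailMessage

MAIL_HOST = "mail.example.com"     # assumed mail server
MAIL_PORT = 587                    # submission port with STARTTLS
USERNAME = "alerts@example.com"    # placeholder credentials
PASSWORD = "change-me"

def send_test_message() -> None:
    message = EmailMessage()
    message["From"] = USERNAME
    message["To"] = "admin@example.com"
    message["Subject"] = "Submission test"
    message.set_content("Authenticated, TLS-protected submission works.")

    with smtplib.SMTP(MAIL_HOST, MAIL_PORT, timeout=10) as server:
        server.starttls()                  # never send credentials in the clear
        server.login(USERNAME, PASSWORD)   # server requires authentication
        server.send_message(message)

if __name__ == "__main__":
    send_test_message()
```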
FTP Servers
File Transfer Protocol (FTP) allows files to be uploaded to and downloaded from remote servers. Servers can be accessed by either authenticated FTP or anonymous FTP. Caution should be taken when using anonymous FTP accounts; they require regular monitoring because they can easily be exploited by hackers. Where FTP is authenticated, secure FTP should be used to encrypt login IDs and passwords rather than transferring them as plain text.
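A small sketch, using Python's standard ftplib, of connecting with FTP over TLS so that credentials and the data channel are not sent as plain text; the host and account details are placeholders.

```python
# FTP over TLS: credentials and directory listing travel over encrypted channels.
from ftplib import FTP_TLS

FTP_HOST = "ftp.example.com"     # assumed server
USERNAME = "deploy"              # placeholder account (never anonymous)
PASSWORD = "change-me"

def list_remote_directory() -> None:
    ftps = FTP_TLS(FTP_HOST, timeout=10)
    try:
        ftps.login(USERNAME, PASSWORD)   # control channel is secured before login
        ftps.prot_p()                    # protect the data channel as well
        for name in ftps.nlst():
            print(name)
    finally:
        ftps.quit()

if __name__ == "__main__":
    list_remote_directory()
```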
DNS Servers
DNS (Domain Name System) servers translate web site URLs into the equivalent IP addresses for routing across network devices. DNS software should be updated regularly, and security measures such as authentication should be employed to prevent unauthorized zone transfers.
Unauthorized access or server intrusion can be mitigated or prevented by disabling or blocking port 53 where it is not required, or by restricting access to the DNS server to one or more pre-determined external systems.
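One way to verify the zone transfer restriction described above is to attempt a transfer yourself and confirm that it is refused. The sketch below assumes the third-party dnspython package and uses placeholder server and zone names; run it only against name servers you are responsible for.

```python
# Attempt an AXFR against our own name server; success indicates a misconfiguration.
import dns.query
import dns.zone

NAME_SERVER = "192.0.2.53"   # assumed address of the DNS server under test
ZONE = "example.com"         # assumed zone name

def zone_transfer_allowed(server: str, zone_name: str) -> bool:
    """Return True if the server hands out a full zone transfer."""
    try:
        zone = dns.zone.from_xfr(dns.query.xfr(server, zone_name, timeout=5))
        return len(zone.nodes) > 0
    except Exception:
        # Refused or timed-out transfers raise; that is the desired behaviour.
        return False

if __name__ == "__main__":
    if zone_transfer_allowed(NAME_SERVER, ZONE):
        print("WARNING: unrestricted zone transfer is possible")
    else:
        print("Zone transfer refused, as expected")
```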
Service Detail
Threats to operating systems are real. Operating systems can be susceptible to a number of internal and external attacks due to unpatched vulnerabilities, disgruntled employees or, in some cases, misconfigured server settings. Hardening the operating system protects the organization against security breaches and malicious attacks. It also improves the overall efficiency of the operating system environment by verifying user permissions, patching vulnerabilities, installing necessary software updates and deactivating unnecessary programs.
Services Identification
The purpose of each Unix Server should be identified.
How will the computer be used?
What categories of information will be stored in the computer?
What kind of information will be processed by the computer?
What are the security requirements and concerns for the information?
What network services will be provided by the computer?
What are the security requirements for these network services?
Identify the network services that will be provided on the server.
Servers should be dedicated to a single service. This simplifies the configuration and reduces the likelihood of configuration errors. In the case of the servers discussed here, the application server should be limited to HTTP (www) or HTTPS services.
The DB2 server should be limited to ports 50000 and 50001. This also eliminates unexpected and unsafe interactions among services that present opportunities for unauthorized intruders. In some cases, it may be necessary to offer more than one service on a single host computer; for example, some vendors' server software combines the hypertext transfer protocol (HTTP) and the file transfer protocol (FTP) services in a single package (Bookholdt, 1989).
It may therefore be appropriate to provide access to public information through both the FTP and HTTP protocols from the same server host. I do not recommend this, because it is a less secure configuration; in any case, each server's listening ports should be checked against the set expected for its role (a sketch follows).
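The sketch referred to above compares the TCP ports actually listening on a host with the set expected for its role (for example, 80 and 443 on the application server, or 50000 and 50001 on the DB2 server). It assumes the third-party psutil package is installed and that the expected-port set is adjusted per server; on some systems it must be run with elevated privileges.

```python
# Compare the host's listening TCP ports against an expected allowlist.
import psutil

EXPECTED_PORTS = {80, 443}   # placeholder: adjust per server role

def listening_ports() -> set:
    """Collect every local TCP port currently in the LISTEN state."""
    ports = set()
    for conn in psutil.net_connections(kind="tcp"):
        if conn.status == psutil.CONN_LISTEN and conn.laddr:
            ports.add(conn.laddr.port)
    return ports

if __name__ == "__main__":
    unexpected = listening_ports() - EXPECTED_PORTS
    if unexpected:
        print(f"Ports listening outside this server's role: {sorted(unexpected)}")
    else:
        print("Only the expected services are listening.")
```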
Determine how the servers will be connected to your network. Many concerns relating to network connections can affect the configuration and use of any one computer. Many organizations use a broadcast technology such as Ethernet for their local area networks; in such cases, information traversing a network segment can be seen by any computer on that segment. You should therefore place only “trusted” computers on the same network segment or, failing that, encrypt information before transmitting it (a brief encryption illustration follows).
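The encryption illustration mentioned above is a hedged example that uses the third-party cryptography package's Fernet recipe to encrypt data before it crosses a shared network segment. Key generation here is ad hoc for the demonstration; distributing and protecting the key between hosts is a separate problem.

```python
# Encrypt a payload before placing it on a shared (broadcast) network segment.
from cryptography.fernet import Fernet

def demo() -> None:
    key = Fernet.generate_key()          # in practice, provision and protect this key
    cipher = Fernet(key)

    payload = b"quarterly payroll export"
    ciphertext = cipher.encrypt(payload)      # safe to place on the wire
    recovered = cipher.decrypt(ciphertext)    # receiving host holds the same key

    assert recovered == payload
    print("ciphertext length:", len(ciphertext))

if __name__ == "__main__":
    demo()
```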
Environment Analysis
Considerations that must be taken into account before implementation can begin on a UNIX server include: what version of the operating system exists on the system? Where will the server be located? What kinds of programs will run on this server? These three questions need to be answered in order to direct the security policy. Depending on the answers, the required level of security can then be determined. For example, if a server simply runs audits, it is likely to share a subnet with other servers (Sumitabh, 2006).
Planning
This is where you must define the overall security policies and goals. In many organizations, this step is normally performed at the corporate level, and is likely therefore to have already been completed.
How much security is needed (High, Medium or Low)?
How much security can your business afford (Cost & time)?
What devices and components should be protected?
Security Policy
A server in an open environment that is easily accessible to more than one system administrator is physically vulnerable to threats. Data needs to be secured and encrypted, and server security must protect these files from possible copying and removal. If the system administrator believes that the information on the server is sensitive to the organization, he or she is at liberty to request higher security.
Password Policy
The computers should not be trusted to remain secure and should therefore be physically disconnected if there are any signs of vulnerability.
When more than one administrator is permitted to log in, a server stands a chance of being vulnerable to employees with access.
ACL Policy
Untrained users who are not familiar with the operating system but have access to the server may accidentally damage the system without knowing how to recover it. There is therefore a need to maintain the integrity and privacy of the system and to use server-hardening features to prevent unauthorized, malicious penetrations and exploits (a simple permission audit is sketched below).
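The permission audit sketched below walks a directory tree and flags world-writable entries, a common symptom of overly permissive permissions or ACLs. The root path is a placeholder, and a full review would also examine POSIX ACLs (for example, with getfacl).

```python
# Flag world-writable files and directories under an assumed root path.
import os
import stat

ROOT = "/srv/app"   # assumed directory to audit

def world_writable(path: str) -> bool:
    return bool(os.stat(path, follow_symlinks=False).st_mode & stat.S_IWOTH)

def audit(root: str) -> None:
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            full = os.path.join(dirpath, name)
            try:
                if world_writable(full):
                    print(f"World-writable: {full}")
            except OSError:
                pass   # skip entries that vanish or cannot be read

if __name__ == "__main__":
    audit(ROOT)
```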
Monitoring
As soon as the infrastructure is built, the systems administrator is required to continuously monitor it for potential attacks and threats. A viable option is performing weekly audits; this prevents frustrating other network users by slowing down the network with unnecessary traffic. Problems discovered should then be addressed through the previous phases in order to find the best possible solution.
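One concrete step for such a weekly audit is file integrity checking: recording SHA-256 hashes of critical files and comparing them against a stored baseline on each run. The sketch below uses only the Python standard library; the monitored file list and baseline location are assumptions for illustration.

```python
# Compare SHA-256 hashes of critical files against a recorded baseline.
import hashlib
import json
import os

BASELINE_FILE = "/var/lib/hardening/baseline.json"   # assumed location
MONITORED_FILES = ["/etc/passwd", "/etc/ssh/sshd_config"]

def sha256_of(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def current_state() -> dict:
    return {path: sha256_of(path) for path in MONITORED_FILES if os.path.exists(path)}

if __name__ == "__main__":
    state = current_state()
    if os.path.exists(BASELINE_FILE):
        with open(BASELINE_FILE) as handle:
            baseline = json.load(handle)
        for path, digest in state.items():
            if baseline.get(path) != digest:
                print(f"Changed since last audit: {path}")
    else:
        os.makedirs(os.path.dirname(BASELINE_FILE), exist_ok=True)
        with open(BASELINE_FILE, "w") as handle:
            json.dump(state, handle, indent=2)
        print("Baseline recorded.")
```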
Incident Response
It makes little sense to respond to an attack only after exposure has already lasted a considerable period; one should react immediately in case of an attack on the system (CERT/CC, 2002).
Management establishes a security policy based on the risks it is willing to bear. The policy itself does not set goals; it serves as a bridge between management's goals and the technical implementation. There is therefore a need to develop a server deployment plan that addresses security issues; most deployment plans cover only the cost of the computers, installation of application software, schedules to minimize work disruption, and user training.
Administrators charged with the responsibility of maintaining and operating the servers should be the only ones permitted access to them. User privileges on the servers should also be determined, and authentication protocols defined.
Authentication can take two forms: (1) operating system authentication, usually employed to authenticate administrators, and (2) network authentication, used to authenticate information system users (McLean, 1996). A minimal illustration of credential verification follows.
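The illustration mentioned above shows credential verification with a salted PBKDF2 hash from Python's standard library. The iteration count is an assumed work factor, and production systems would normally rely on the operating system's own authentication (for example, PAM) or a directory service rather than custom code.

```python
# Salted PBKDF2 hashing and constant-time verification of a password.
import hashlib
import hmac
import os

ITERATIONS = 200_000   # assumed work factor

def hash_password(password: str, salt: bytes = b"") -> tuple:
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected)

if __name__ == "__main__":
    salt, stored = hash_password("correct horse battery staple")
    print(verify_password("correct horse battery staple", salt, stored))  # True
    print(verify_password("wrong guess", salt, stored))                   # False
```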
Conclusion
Risks can be mitigated by an ongoing process of hardening the operating systems, network, databases, applications and devices, as well as the other relevant resources of the IT infrastructure. To minimize risk, a contingency plan remains the most effective and efficient way to safeguard the organization's information system from collapse. Operating system hardening, anti-virus solutions and periodic security patching offer prevention, detection and corrective action, and benefit any organization that has an information system in place.
In summary, the risk assessment process is about making decisions that minimize risk. The impact of an attack and the level of acceptable risk for any given situation are fundamental policy decisions.
Likewise, vulnerabilities are essentially design issues that must be addressed during the design, development and implementation of information systems. One of the fundamental problems of risk management is achieving a cost-effective balance between design characteristics and the corresponding countermeasures to threats and their impacts on the organization.
Modern computing environments are distributed infrastructures which require any organization to develop intrusion detection strategies for its servers. I assume here that there are no dedicated sensors on the internal network; many of the common intrusion detection methods therefore depend solely on the logs that the operating system can produce and on the availability of auditing tools that analyze those logs. This makes it important to install appropriate software tools and to configure both these tools and the operating system to collect and manage the necessary information.
The organization must likewise update its computer deployment plan when relevant changes take place, such as new technologies, updates to the network architecture, new security threats, or the addition of new classes of users or new organizational units. The environment will only work effectively and efficiently if the process is centralized.
It is important for financial institutions to develop, implement and monitor appropriate information security programs.
Whether systems are maintained in-house or by a third-party vendor, appropriate security controls and risk management techniques should be employed. A security program is composed of system architecture and effective security policies, supported by risk assessment practices and tools (Bagchi, 2003). IT threats and countermeasures evolve continuously.
Consequently, institutions should have proactive risk assessment processes in place to identify emerging vulnerabilities and threats to their information systems. A solid IT security policy should identify the prevention, detection and response measures to be taken. Preventive measures include tools for regularly assessing vulnerabilities and periodic system analysis; intrusion detection tools can also be effective in detecting potential intrusions and system misuse.
This paper will help security administrators gain a basic knowledge of security tools, operating system hardening, intrusion detection and auditing. This knowledge can be applied directly to their servers, and many vulnerable loopholes will be closed.
It is therefore critical that every security-minded operating system administrator maintains their knowledge of security by researching and consulting various Internet resources.
By applying the practices in this paper, the overall security of operating systems can be improved dramatically (Bookholdt, 1989).
Abstract
Operating system hardening can be defined as the process of addressing security weaknesses in an operating system by implementing the latest operating system patches, hot fixes and updates, and by following specific policies and procedures to reduce attacks and system downtime.
Hardening is not a one-time activity; it is an ongoing task that mitigates risk and sustains high-quality computing. A production server must be installed securely: unwanted devices removed, default settings avoided, misconfigurations corrected, new system programming developed, the current configuration enhanced, and new security patches applied before the server enters the production environment.
Hardening of the operating system should support high integrity, availability, reliability, scalability, privacy and confidentiality at reduced risk levels, so as to obtain the greatest benefit from the organization's critical information technology infrastructure.
Safeguarding information and protecting the integrity of the organization's network systems are vital. IT security personnel in many companies have established policies that apply to the entire organization, but it is up to the individual departments that manage the systems to implement them (Schneier, 1996).
Security professionals also recognize the need for flexibility at the implementation stage, due to the unique requirements of each department. Hardening an operating system involves the removal of all non-essential tools, utilities and other system administration options, any of which could ease a hacker's path into your systems (Bento, 2003).
Because of this, the hardening process goes a long way toward ensuring that all appropriate security features are activated and configured correctly. This paper focuses specifically on the hardening of operating systems. In it, I define the concept of software security hardening, which allows developers and maintainers to deploy and harden security features and to remedy existing vulnerabilities and threats in open source software.
I also propose a classification of the different levels at which hardening can be applied, as well as a methodology for hardening high-level security into applications based on a well-defined security ontology. In addition to this contribution, I elaborate on methods for hardening against security vulnerabilities.
The scope of this paper includes enforcing adequate password rules, implementing proper user-security mechanisms, enabling efficient system auditing, and monitoring file and directory access.