I. Introduction
“Everything has changed, except our way of thinking.” - Albert Einstein
Data are a vital organizational resource that must be managed like any other important business asset. Today’s business enterprises cannot survive or succeed without quality data about their internal operations and external environment. This dependence drives corporations to analyze every bit of information extracted from huge data warehouses in search of competitive advantage, and it has turned data storage and management into a key strategic function of the information age.
I. Background
1. Evolution of Data Systems
The demand for information has risen in every organization, and organizations have steadily become more global and widespread.
Database Development
Database development involves data planning and database design activities. Data models that support business processes are used to develop databases that meet the information needs of the users.
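To make the step from data planning to database design concrete, here is a minimal sketch of how a data model for a hypothetical order-taking process might be turned into tables and queried. The table names, sample rows, and the use of SQLite (chosen only because it is self-contained) are illustrative assumptions, not part of any specific system discussed here.

```python
import sqlite3

# Hypothetical data model for a simple order-taking business process,
# sketched with SQLite for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    );
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        placed_on   TEXT NOT NULL
    );
""")
conn.execute("INSERT INTO customer (customer_id, name) VALUES (1, 'Acme Ltd')")
conn.execute(
    "INSERT INTO orders (order_id, customer_id, placed_on) VALUES (10, 1, '2024-01-15')"
)

# The model directly answers a user information need: which customers
# placed orders, and when.
rows = conn.execute("""
    SELECT c.name, o.placed_on
    FROM orders o JOIN customer c ON c.customer_id = o.customer_id
""").fetchall()
print(rows)  # [('Acme Ltd', '2024-01-15')]
```

The point of the sketch is only that entities and relationships identified during data planning (customers place orders) map directly onto tables and foreign keys during database design.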
8. Importance of Database Vulnerability Assessment
Many organizations rely on database vulnerability assessments as part of their information security strategies to ensure data is protected and to guard against information theft that could expose a company to liability.
A database vulnerability assessment hunts for weaknesses in databases and searches for anything out of the ordinary, allowing organizations to act before they’re subjected to a devastating attack.
9. Database Components By Tools
i. Oracle
Oracle Database 11g Standard Edition is optimized for deployment in midsized business environments. It is supported on Windows, Linux, and Unix.
ii. MySQL
The MySQL database has become the world's most popular open source database because of its consistently fast performance, high reliability, and ease of use. MySQL runs on more than 20 platforms, including Linux, Windows, OS X, HP-UX, AIX, and NetWare, giving you the kind of flexibility that puts you in control. MySQL offers one of the most powerful transactional database engines on the market.
Up until this point, Third Star Financial Services has operated through a succession of mergers and acquisitions in which systems were inherited but never integrated into the network. Its data management has been virtually non-existent and entirely ineffective. Evidence of this can be found in the absence of an enterprise-wide data management solution and the presence of several disparate systems operating independently with no measurable benefit to the company. Due to a lack of actionable data, management makes decisions based on instinct rather than analysis. A direct consequence is a steadily declining market share and the loss of high-level employees to competing companies. Fortunately, these deficiencies have been identified, and Third Star executives have set a new goal of modernizing and streamlining operations. Using concepts outlined by the Data Management Association (DAMA), the proposed enterprise architecture will allow Third Star to transform its data from a liability into an asset.
Companies should develop a control requiring routine vulnerability assessments of their customer-facing web sites, network infrastructure, and associated systems (such as database systems). Vulnerability assessments can help identify potential weaknesses in systems and also provide feedback to the organization’s IT department on its current operational policy and security posture. The cost of performing a routine vulnerability assessment is considerably less than that of an actual data breach.
There are numerous excellent open source database servers, such as H2, HyperSQL, MySQL, and PostgreSQL, to name a few. They all offer top-notch functionality, performance, scalability, and security. As for which one is best, I recommend PostgreSQL. PostgreSQL is an object-relational database solution that is among the most feature-rich options compared with the bigger commercial products from Oracle, IBM, Sybase, and Informix, and the best part is that it's free. It is also one of the earliest database systems released, with a proven track record of more than 23 years of active development: it was created back in 1989, and the only major database software older than it is Oracle, created back in 1979. PostgreSQL might not be the fastest, but it more than makes up for that with its functionality. It offers two types of interfaces, a GUI (for those who prefer point-and-click administration) and a SQL command line. It runs on most operating systems, including Windows, Linux, Mac, and Unix. It also includes a vast array of services and tools that streamline database administration. Here are just some examples: full ACID (Atomicity, Consistency, Isolation, and Durability) compliance, commercial and noncommercial support, triggers, user-defined data types, stored procedures, online backup, multiple index types, and embedded SQL support.
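As a small illustration of the ACID compliance mentioned above, the sketch below demonstrates atomicity: when a transaction fails partway through, the database rolls back to its prior state. It uses Python's built-in sqlite3 module purely to keep the example self-contained; PostgreSQL offers the same guarantee through BEGIN/COMMIT/ROLLBACK, and the account table and amounts are invented for the demonstration.

```python
import sqlite3

# Atomicity sketch: a transfer that fails mid-way leaves balances untouched.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO account VALUES (1, 100), (2, 0)")
conn.commit()

try:
    with conn:  # opens a transaction; rolls back automatically on exception
        conn.execute("UPDATE account SET balance = balance - 150 WHERE id = 1")
        # Simulate a business-rule failure partway through the transfer:
        raise ValueError("insufficient funds")
except ValueError:
    pass

# The partial debit was rolled back, so both balances are unchanged.
balances = conn.execute("SELECT balance FROM account ORDER BY id").fetchall()
print(balances)  # [(100,), (0,)]
```

The same behavior in PostgreSQL would be written as BEGIN; UPDATE …; ROLLBACK; the point is that the intermediate, inconsistent state is never visible to other users.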
Which database management system platform should I use? This is a very common question developers ask when working on a project that requires storing and querying data. There are four well-known platforms people may consider: Oracle, Microsoft SQL Server, Teradata, and DB2. This essay will compare and contrast the differences and similarities among these four platforms.
As it applies to an IT environment, a vulnerability assessment is used to identify existing vulnerabilities, giving the environment's owner an awareness of what needs to be fixed (Who Needs a Vulnerability Assessment, 2017). The assessment must be viewed for what it is: a one-time snapshot that in no way highlights all vulnerabilities. Multiple assessments must be conducted over time to ensure that as many avenues of weakness as possible are explored, identified, and marked for improvement. As new systems are added, programs are changed, or other modifications are made, new vulnerabilities may be introduced.
The periodic assessment of risk to agency operations or assets resulting from the operation of an information system is an important activity. It summarizes the risks associated with the vulnerabilities identified during the vulnerability scan. Impact refers to the magnitude of potential harm that may be caused by successful exploitation. It is determined by the value of the resource at risk, both in terms of its inherent (replacement) value and its importance (criticality) to business missions, as well as the sensitivity of the data contained within the system. The results of the system security categorization for each system are used as an aid in determining individual impact estimations for each finding, and the level of impact is rated accordingly.
A vulnerability assessment is a risk-testing process that finds, quantifies, and ranks possible vulnerabilities, surfacing as many security defects as possible in a given timeframe. Depending on the organization's scope, there are many ways to conduct a vulnerability assessment, and it may involve both automated and manual techniques.
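As a toy illustration of one automated technique, the sketch below flags database installations whose reported software version falls below a patched baseline. Every hostname, product name, and version number here is invented for illustration; a real assessment would draw on an actual asset inventory and published vendor advisories rather than a hard-coded table.

```python
# Toy automated check: flag database software below a known-patched baseline.
# All hosts, products, and version numbers are invented for illustration.

def parse_version(v):
    """Turn '15.2' into (15, 2) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

# Hypothetical minimum patched versions (not real advisories).
BASELINE = {"postgresql": "15.6", "mysql": "8.0.36"}

def find_vulnerable(inventory):
    """Return (host, product, version) entries that fall below the baseline."""
    findings = []
    for host, product, version in inventory:
        minimum = BASELINE.get(product)
        if minimum and parse_version(version) < parse_version(minimum):
            findings.append((host, product, version))
    return findings

inventory = [
    ("db1.example.com", "postgresql", "15.2"),   # below baseline: flagged
    ("db2.example.com", "mysql", "8.0.36"),      # at baseline: not flagged
]
print(find_vulnerable(inventory))
# [('db1.example.com', 'postgresql', '15.2')]
```

A check like this is cheap to run on a schedule, which is exactly the argument the passage makes: routine assessment costs far less than a breach.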
Risk assessment and threat assessment should go hand in hand. Their outcome should provide recommendations that maximize the protection of confidentiality, integrity, and availability while still providing functionality and usability. The purpose of a risk assessment is to ensure that sensitive data and valuable assets are protected. An organization should take a hard look at who has access to sensitive data and whether those accesses are required. The security audit should monitor the company's systems and users to detect illicit activity.
Upon analyzing the security risk for each new requirement, we assigned value points from the scale 1, 2, 3, 5, 8, 13, 20, 40, and 100 to each asset in the database table. We then determined ease points using criteria ranging from easiest to hardest to attack. With the ease-of-attack values and the asset values, we could determine which requirement was more vulnerable by calculating the resulting risk score.
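The scoring described above might be sketched as follows. The sample requirements, the specific ease-of-attack numbers, and the choice of a simple product of the two point values as the combined score are all assumptions for illustration; only the 1-to-100 value-point scale comes from the text.

```python
# Sketch of the ranking described above: each requirement gets an asset value
# point (from the stated 1..100 scale) and an ease-of-attack point; a simple
# product gives a relative exposure score. Samples and weighting are assumed.

VALUE_POINTS = (1, 2, 3, 5, 8, 13, 20, 40, 100)

def exposure(value, ease):
    """Combined risk score for one requirement (higher = more vulnerable)."""
    assert value in VALUE_POINTS, "asset value must come from the point scale"
    return value * ease

# Hypothetical requirements: (asset value points, ease-of-attack points).
requirements = {
    "customer PII table": (100, 3),  # very valuable, moderately easy to attack
    "audit log":          (13, 5),
    "test fixtures":      (2, 8),    # easy to attack but nearly worthless
}

ranked = sorted(requirements.items(),
                key=lambda kv: exposure(*kv[1]), reverse=True)
print([name for name, _ in ranked])
# ['customer PII table', 'audit log', 'test fixtures']
```

The ranking shows why both inputs matter: the test fixtures are the easiest target, but their negligible asset value pushes them to the bottom of the list.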
The database used should be open and industry standard, to allow easy integration with other applications and easy movement of data in the future.
In 1977, Larry Ellison, Bob Miner, and Ed Oates founded System Development Laboratories. Inspired by a 1970 research paper by an IBM researcher titled “A Relational Model of Data for Large Shared Data Banks,” they decided to build a new type of database called a relational database system. The original project on the relational database system was for the government (the Central Intelligence Agency).
With advances in technology happening constantly, it can be hard to keep up with the latest trends. Organizations that cannot keep up may develop flaws in their security, and any flaw in security can have a detrimental effect on an organization’s database. Almost every organization has some sort of database, whether for maintaining customer records, inventory, or other vital information.
An information security professional’s job is to deploy the right safeguards, evaluate risks against critical assets, and mitigate those threats and vulnerabilities. Management can ensure that the company’s assets, such as data, remain intact by adopting the latest technology and implementing the right policies. Risk management focuses on analyzing risk and on mitigating actions that reduce it. Successful implementation of security safeguards depends on the knowledge and experience of the information security staff. This paper addresses the methods and fundamentals of systematically conducting risk assessments on the security risks of information systems.
Database servers are expected to meet the needs of the business, the market, and end users by providing tremendous performance. Since companies are moving toward “big data” technology to support larger audiences, there is always a need for a performance-enriched data warehouse server running behind the scenes to accommodate end users’ needs. A traditional database server is capable of handling gigabytes of data while providing only a minimal amount of performance; this is due to the restricted set of pre-defined configurations and parameters such servers come with. A traditional database server cannot match the performance of a data warehouse server running on a massively parallel engine.