Data Modeling Best Practices Workflow

Presentation on theme: "Data Modeling Best Practices Workflow"— Presentation transcript:

1 Data Modeling Best Practices Workflow
[Slide diagram roles: Data Architect, Developers, Logical Data Modelers, Physical Data Modelers, Modeling Repository, Business Managers]

These reporting and workflow enhancements in Toad 9.0 enable the development best-practices workflow you see above. Development teams are often geographically dispersed, with varying levels of quality, and one problem this presents is that organizations can rarely identify the QUALITY of code OBJECTIVELY. Typically, if the code "works" and returns the right results, it gets staged and moved to production. Later, that same code consumes costly DBA troubleshooting time when it becomes a problem in production. Finding it is hard enough, but because most developers don't adhere to any SQL coding standard, having another developer fix it takes longer than it should.

(Speaker note: show the tools quickly, either with screenshots or short demos that relate to this part of the process. CLAUDIA AND STEVEN.)

As you can see in Step 2, automating the code review and SQL scan process lets your development managers quickly and objectively identify poorly written code. This approach allows you to spot trends from a particular group that may continually fall short of standards, and to proactively take action to correct the problem. Without this objective reporting, it is nearly impossible to control code quality in most organizations.

Once the code is semantically correct, it is important to be sure the SQL is fully optimized. The simple fact of the matter is that most developers never take this extra step, and that is a recipe for difficult-to-find production problems. Tuning SQL is a long, cumbersome task that requires a deep understanding of the database optimization engine. Frankly, only the top 5% of developers do it well, in part because more and more application developers, rather than database specialists, are writing SQL code. And even old-school SQL programmers who know what they're doing often lack the time to dive in and optimize code the way you would like.
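The automated code review described in Step 2 can be sketched as a rule-driven scan over SQL source. This is a minimal, hypothetical illustration of the idea, not the actual rule set or engine used by Toad's SQL scanning; the rules shown (flagging `SELECT *` and an unfiltered `DELETE`) are assumptions chosen only to demonstrate the pattern.

```python
import re

# Hypothetical coding-standard rules -- illustrative only, not the
# actual checks performed by Toad's code review / SQL scan feature.
RULES = [
    (re.compile(r"\bselect\s+\*", re.IGNORECASE),
     "avoid SELECT *"),
    (re.compile(r"\bdelete\s+from\s+\w+\s*;", re.IGNORECASE),
     "DELETE without WHERE clause"),
]

def scan_sql(sql: str) -> list[str]:
    """Return the list of coding-standard violations found in the SQL text."""
    findings = []
    for pattern, message in RULES:
        if pattern.search(sql):
            findings.append(message)
    return findings
```

Running such a scan over every check-in gives managers the objective, per-team quality metrics the notes describe, without anyone reading the code by hand.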
Our automated approach is fast and easy, delivers the right results every time, and can run while the development teams are home sleeping. Now that the code is semantically correct and optimized, it's time to validate that it will, in fact, stand up under the weight of a production load. When production load is introduced into a system, the internal profiles of the SQL can change dramatically from the database optimizer's point of view. This is one of the most common problems in moving from test to production, and it is impossible to detect without some form of load testing. However, most staging teams are not going to adopt a high-end testing tool just to ensure durability. Quest provides the surgical approach with Benchmark Factory for Databases, a lightweight, easy-to-install tool made for exactly this purpose. You don't want a bulldozer to help you plant your landscaping flowers, and you don't want a high-end, complicated load tool to validate SQL.

At the end of this process, you have tackled three of the most difficult problems that have plagued SQL development managers for years, with an EASY and AUTOMATED way to GUARANTEE that:
Coding standards are being met
SQL code is optimized
Your systems will stand up to the scrutiny of a production load

[Slide diagram: JAD Sessions, Initial Physical Design, Business End-Users, Process Model, DBA, Systems People]
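The load-testing step can be illustrated with a toy harness that runs the same SQL from several concurrent connections and times the result. This is only a sketch of the concept, using SQLite as a stand-in database; a real validation would use a purpose-built tool such as Benchmark Factory, and the function name and parameters here are assumptions for illustration.

```python
import sqlite3
import threading
import time

def run_load(db_path: str, sql: str, threads: int = 8,
             iterations: int = 100) -> float:
    """Execute `sql` repeatedly from several concurrent connections
    and return total elapsed wall-clock seconds.

    A toy stand-in for a real database load tool: the point is that
    SQL which is fast single-threaded may behave very differently
    under concurrent, production-like load.
    """
    def worker():
        conn = sqlite3.connect(db_path)  # one connection per simulated user
        for _ in range(iterations):
            conn.execute(sql).fetchall()
        conn.close()

    start = time.perf_counter()
    workers = [threading.Thread(target=worker) for _ in range(threads)]
    for t in workers:
        t.start()
    for t in workers:
        t.join()
    return time.perf_counter() - start
```

Comparing the elapsed time at 1 simulated user versus 8 or 32 gives a crude picture of how a statement scales, which is exactly the test-versus-production gap the notes warn about.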

