    • Computing & IT
      February 2008

      fruITion

      Creating the Ultimate Corporate Strategy for Information Technology

      by Chris Potts

      Ian is a Chief Information Officer (CIO) who is about to go on a journey of change - whether he likes it or not. He will be expected to explore, challenge and radically recast the complex, often hostile relationships that can exist between a business and the people in its Information Technology (IT) department. On the way, Ian, his Chief Executive Officer, Chief Financial Officer and other key stakeholders experience a transformation in how a business needs to think about the value of its IT people and the work that they do. This results in some truly groundbreaking innovations in the scope and contribution of Ian's role as CIO, the people who work for him and the strategy that he leads. Watch the characters in this extraordinary business novel as they meet the challenge, struggle and grow. Share in Ian's transformation, and join the author in observing key messages as the adventure unfolds.

      "Part entertaining novel and part enlightening textbook - fruITion takes the reader through a discovery process revealing indispensable messages about the next generation of strategies for Information Technology." - Jeremy Hall, Managing Director, IRM UK Strategic IT Training

      "fruITion brings vividly to life the issues of being a CIO in today's corporate world and how IT, when properly integrated into the objectives of a business, can drive massive value creation. His insights into how to win the engagement war and bring technology strategies alive for the non-technical are absolutely spot on." - Steve Adams, COO and Managing Director for Card Services, Euronet Worldwide

      "The modern CIO is to be seen as part of the business rather than a service provider to the business. Chris Potts is at the forefront of thinking that will put us all there if we act on his inspiration." - David Brown, CIO of Scottish Water

      More from the author, Chris Potts: The debate over the CIO role, and about the extent to which it should be about business or technology, is taking place in an increasing vacuum of strategic context. Some CIOs have abandoned strategy altogether, while others persevere with a traditional IT strategy founded in the mindset of the mainframe era. Meanwhile, business managers and staff continue to develop their knowledge of technology and their understanding of how to exploit it. There seems to be a presumption that the next-generation strategic purpose of the CIO will be an incremental step on from what has gone before - significant, maybe, but still incremental. What if the CIO's new strategic context is not incremental but disruptive, requiring a very different mindset and skill set? And, most crucially, what if the corporate strategists - rather than the CIO community - are the ones deciding what that context is? Their offer to the CIO: you can become one of the corporate strategists like us, but not with your traditional scope and approach to strategy. What does that offer look like, and what does it mean for incumbent CIOs and the people who work for them?

      Chris Potts works with executives and CIOs in industry-leading companies around the world, formulating and executing the new generation of corporate strategies for exploiting IT. He delivers public seminars founded on his own breakthrough work with clients, and has provided training to some of the world's leading consultancies.

    • Computing & IT
      January 2014

      The Audacity to Spy

      How Government, Business, and Hackers Rob Us of Privacy

      by Catherine Nolan and Ashley M. Wilson, JD

      Ever get the feeling you're being watched? The thieves who steal identities are using cutting-edge, high-tech tools that can take one fact from a social media site, another from an online travel survey, a third from a purchase made via the internet, and even access highly confidential medical records. Little by little they piece together your buying habits, your religious and school affiliations, the names of your family and pets, your political views, your driving habits, the places you have vacationed, and much, much more. This is not science fiction and this is not the future; this is what is happening to each and every one of us now - today. And although the vast majority of adults say they are concerned about providing personal information online, nearly a third say they have never used a privacy setting on their computer, never inquired about the charities to which they donate their money, never worried about someone accessing their medical information, and never thought twice about giving a financial institution their social security number over the internet. The Audacity to Spy, written by an attorney with an interest in privacy laws and legislation and her grandmother, an experienced information analyst, reveals the ways in which your identity and personal data have been stolen by various sources. Yes, you should be concerned about the NSA and other government agencies having your phone logs and emails; but you should worry more about the insidious data brokers that are collecting information about you every time you log on to your laptop, use your cell phone, access an app, or use your GPS. Companies are collecting a variety of data about you, combining it with location information, and using it both to personalize their own services and to sell to other advertisers for behavioral marketing. Law enforcement agencies are tracking your car and insurance companies are installing devices to monitor your driving. Clerks are making copies of your credit cards. And if that weren't enough, the FBI has reported that hackers have been discovered embedding malicious software in two million computers, opening a virtual door for criminals to rifle through users' valuable personal and financial information. More than just warning you about the ways your data can be stolen, the book closes each chapter with suggestions for limiting the amount of personal data that is available to be seized and divulged. Can you completely cut off the flow of information about yourself? The answer is no, not completely - there is already too much data out there and increasingly sophisticated ways to obtain bits and pieces. But knowing how it is collected, and by whom, gives you the power to control sensitive information and determine how much of your life you wish to expose to those more than willing to exploit it.

    • Computing & IT
      August 2014

      Data Resource Data

      A Comprehensive Data Resource Understanding

      by Michael Brackett

      Are you struggling to gain a thorough understanding of your organization's data resource? Are you finding that your data resource has become quite disparate through lack of understanding? Are you having difficulty developing meaningful meta-data about your data resource, or understanding the meta-data that have been developed? Do you agonize over finding a way to document your data resource that is thorough, understandable, and readily available? If the answer to any of these questions is yes, then you need to read Data Resource Data to help you understand your organization's data resource. Most public and private sector organizations do not have a formal process for thoroughly documenting the entire data resource at their disposal, in any meaningful manner, that is readily available to everyone in the organization. Most do not even have a formal design for that documentation. The much abused, misused, misspelled, undefined, and incomplete meta-data are not providing a denotative understanding of the organization's data resource, without which a high quality data resource cannot be developed. Data Resource Data provides the complete detailed data resource model for understanding and managing data as a critical resource of the organization. The model presents formal data resource data as a replacement for the relatively ineffective meta-data. It provides an excellent example of a formal data resource model, compared to a traditional data model, that can be easily implemented by any organization. The use of data resource data ensures a thorough understanding of an organization's data resource and the development of a high quality comparate data resource. As in Data Resource Simplexity, Data Resource Integration, and Data Resource Design, Michael Brackett draws on five decades of data management experience, in a wide variety of public and private sector organizations, to understand and document an organization's data resource. He leverages theories, concepts, principles, and techniques from many varied disciplines, such as human dynamics, mathematics, physics, chemistry, philosophy, and biology, and applies them to the process of formally documenting an organization's data resource.

    • Computing & IT
      September 2014

      Non-Invasive Data Governance

      The Path of Least Resistance and Greatest Success

      by Robert S. Seiner

      Data governance programs focus on authority and accountability for the management of data as a valued organizational asset. Data governance should not be about command-and-control, yet at times it can become invasive or threatening to the work, people and culture of an organization. Non-Invasive Data Governance™ focuses on formalizing existing accountability for the management of data and improving formal communications, protection, and quality efforts through effective stewarding of data resources. Non-Invasive Data Governance provides a complete set of tools to help you deliver a successful data governance program. Learn how steward responsibilities can be identified, recognized, formalized, and engaged according to people's existing responsibilities rather than assigned as extra work; how governance of information can be applied to existing policies, standard operating procedures, practices, and methodologies rather than introduced as new processes or methods; how governance of information can support all data integration, risk management, business intelligence and master data management activities rather than imposing inconsistent rigor on these initiatives; how a practical and non-threatening approach can be applied to governing information and promoting stewardship of data as a cross-organization asset; and how the best practices and key concepts of this non-threatening approach can be communicated effectively to leverage strengths and address opportunities to improve.

    • Computing & IT

      The Minimum You Need to Know to Be an OpenVMS Application Developer

      by Roland Hughes

      For years now the question has been surfacing in the OpenVMS community: "Where are the pimply faced kids?" The other situation that seems to recur is a developer of one language suddenly finding themselves having to modify or maintain an application written in a language completely foreign to them. This book was a year-long effort to answer both of those questions, and it should also help those who want to work on a good platform. Once the rudiments of logging in, symbols, logicals and the various editors are handled, the book takes the reader on a journey of development using the most common tools encountered on the OpenVMS platform and one new tool making headway. A single sample application (a lottery tracking system) is developed using FMS and RMS indexed files in each of the covered languages (BASIC, FORTRAN, COBOL and C/C++). The reader is also shown how to use CDD, CMS and MMS with these languages. A CD-ROM is included which contains the source, MMS and command files developed through the course of the book. Once RMS has been covered with all of the languages, the same application is built using MySQL with C and FMS. This introduces readers to the use of relational databases if they are not already familiar with the concept. Rounding out the technical portion of the book is the same application using RDB with FMS. While source code is provided for all of the language implementations, only FORTRAN and COBOL are actually covered in the text. It is the hope of the author that this book will prove a useful reference on the desk of every OpenVMS developer. The inclusion of MySQL should benefit both those unfamiliar with relational technology and those platform veterans interested in playing with MySQL for the first time.

    • Computing & IT
      August 2014

      100 Ideas that Changed the Web

      by Jim Boulton

      This innovative title looks at the history of the Web, from its early roots in the research projects of the US government to the interactive online world we know and use today. Fully illustrated with images of early computing equipment and the inside story of the online world's movers and shakers, the book explains the origins of the Web's key technologies, such as hypertext and markup language; the social ideas that underlie its networks, such as open source and Creative Commons; and key moments in its development, such as the move to broadband and the Dotcom Crash. Later ideas look at the origins of social networking and the latest developments on the Web, such as the Cloud and the Semantic Web. Following the design of the previous titles in the series, this book is in a new, smaller format. It provides an informed and fascinating illustrated history of our most used and fastest-developing technology.

    • Computing & IT
      January 1985

      Managing Microcomputers in Large Organizations

      by Board on Telecommunications and Computer Applications, National Research Council

      The information age is taking its toll on traditional office management techniques. According to Infosystems, "If you're cautious of 'experts' who claim to have all the answers, then you'll find comfort in the theme of 'unleashed creativity' that recurs throughout the 20 essays presented in this book. . . . Organizations will have to devise a strategy for understanding how [a microcomputer's] performance can be monitored. Regardless of what may happen, this book provides managers with appropriate ammunition."

    • Computing & IT
      January 1987

      Computer Assisted Modeling

      Contributions of Computational Approaches to Elucidating Macromolecular Structure and Function

      by Committee on Computer-Assisted Modeling, National Research Council

    • Computing & IT
      January 1990

      Keeping the U.S. Computer Industry Competitive

      Defining the Agenda

      by The Computer Science and Technology Board, National Research Council

      This book warns that retaining U.S. preeminence in computing at the beginning of the next century will require long-term planning, leadership, and collective will that cannot be attained with a business-as-usual approach by industry or government. This consensus emerged from a colloquium of top executives from the U.S. computer sector, university and industry researchers, and government policymakers. Among the major issues discussed are long-term, or strategic, commitment on the part of large firms in the United States; cooperation within and among firms and between industry, universities, and government; weaknesses in manufacturing and in the integration of research, development, and manufacturing; technical standards for both hardware and software manufacture and operation; and education and infrastructure (in particular, computer networks).

    • Computing & IT
      January 1990

      Computers at Risk

      Safe Computing in the Information Age

      by System Security Study Committee, Commission on Physical Sciences, Mathematics, and Applications, National Research Council

      Computers at Risk presents a comprehensive agenda for developing nationwide policies and practices for computer security. Specific recommendations are provided for industry and for government agencies engaged in computer security activities. The volume also outlines problems and opportunities in computer security research, recommends ways to improve the research infrastructure, and suggests topics for investigators. The book explores the diversity of the field, the need to engineer countermeasures based on experts' speculation about what computer attackers may do next, why the technology community has failed to respond to the need for enhanced security systems, how innovators could be encouraged to bring more options to the marketplace, and how to balance the importance of security against the right to privacy.

    • Computing & IT
      January 1991

      Intellectual Property Issues in Software

      by Steering Committee for Intellectual Property Issues in Software, National Research Council

      Software is the product of intellectual creativity, but protection of the intellectual property residing in software is the subject of some controversy. This book captures a wide range of perspectives on the topic from industry, academe, and government, drawing on information presented at a workshop and forum.

    • Computing & IT
      February 1991

      Mapping the Brain and Its Functions

      Integrating Enabling Technologies into Neuroscience Research

      by Constance M. Pechura and Joseph B. Martin, Editors; Committee on a National Neural Circuitry Database, Division of Health Sciences Policy, Division of Biobehavioral Sciences and Mental Disorders

      Significant advances in brain research have been made, but investigators who face the resulting explosion of data need new methods to integrate the pieces of the "brain puzzle." Based on the expertise of more than 100 neuroscientists and computer specialists, this new volume examines how computer technology can meet that need. Featuring outstanding color photography, the book presents an overview of the complexity of brain research, which covers the spectrum from human behavior to genetic mechanisms. Advances in vision, substance abuse, pain, and schizophrenia are highlighted. The committee explores the potential benefits of computer graphics, database systems, and communications networks in neuroscience and reviews the available technology. Recommendations center on a proposed Brain Mapping Initiative, with an agenda for implementation and a look at issues such as privacy and accessibility.

    • Computing & IT
      January 1991

      The Future of Statistical Software

      Proceedings of a Forum

      by Panel on Guidelines for Statistical Software, National Research Council

      This book presents guidelines for the development and evaluation of statistical software designed to ensure minimum acceptable statistical functionality as well as ease of interpretation and use. It consists of the proceedings of a forum that focused on three qualities of statistical software: richness--the availability of layers of output sophistication, guidance--how the package helps a user do an analysis and do it well, and exactness--determining if the output is "correct" and when and how to warn of potential problems.

    • Computing & IT
      January 1992

      Computing the Future

      A Broader Agenda for Computer Science and Engineering

      by Committee to Assess the Scope and Direction of Computer Science and Technology, National Research Council

      Computers are increasingly the enabling devices of the information revolution, and computing is becoming ubiquitous in every corner of society, from manufacturing to telecommunications to pharmaceuticals to entertainment. Even more importantly, the face of computing is changing rapidly, as even traditional rivals such as IBM and Apple Computer begin to cooperate and new modes of computing are developed. Computing the Future presents a timely assessment of academic computer science and engineering (CS&E), examining what should be done to ensure continuing progress in making discoveries that will carry computing into the twenty-first century. Most importantly, it advocates a broader research and educational agenda that builds on the field's impressive accomplishments. The volume outlines a framework of priorities for CS&E, along with detailed recommendations for education, funding, and leadership. A core research agenda is outlined for these areas: processors and multiple-processor systems, data communications and networking, software engineering, information storage and retrieval, reliability, and user interfaces. This highly readable volume examines computer science and engineering as a discipline and how computer scientists and engineers are pushing back the frontiers of their field; how CS&E must change to meet the challenges of the future; the influence of strategic investment by federal agencies in CS&E research; recent structural changes that affect the interaction of academic CS&E and the business environment; and specific examples of interdisciplinary and applications research in four areas: earth sciences and the environment, computational biology, commercial computing, and the long-term goal of a national electronic library. The volume provides a detailed look at undergraduate CS&E education, highlighting the limitations of four-year programs, and discusses the emerging importance of a master's degree in CS&E and the prospects for broadening the scope of the Ph.D. It also includes a brief look at continuing education.

    • Computing & IT
      March 1996

      Statistical Software Engineering

      by Panel on Statistical Methods in Software Engineering, National Research Council

      This book identifies challenges and opportunities in the development and implementation of software that contains significant statistical content. While emphasizing the relevance of using rigorous statistical and probabilistic techniques in software engineering contexts, it presents opportunities for further research in the statistical sciences and their applications to software engineering. It is intended to motivate and attract new researchers from statistics and the mathematical sciences to attack relevant and pressing problems in the software engineering setting. It describes the "big picture," as this approach provides the context in which statistical methods must be developed. The book is a survey directed at the mathematical sciences audience, but software engineers should also find the statistical emphasis refreshing and stimulating. It is hoped that the book will have the effect of seeding the field of statistical software engineering by its indication of opportunities where statistical thinking can help to increase understanding, productivity, and quality of software and software production.

    • Computing & IT
      January 2002

      Broadband

      Bringing Home the Bits

      by Committee on Broadband Last Mile Technology, Computer Science and Telecommunications Board, National Research Council

      Broadband communication expands our opportunities for entertainment, e-commerce and work at home, health care, education, and even e-government. It can make the Internet more useful to more people. But it all hinges on higher capacity in the "first mile" or "last mile" that connects the user to the larger communications network. That connection is often adequate for large organizations such as universities or corporations, but enhanced connections to homes are needed to reap the full social and economic promise. Broadband: Bringing Home the Bits provides a contemporary snapshot of technologies, strategies, and policies for improving our communications and information infrastructure. It explores the potential benefits of broadband, existing and projected demand, progress and failures in deployment, competition in the broadband industry, and costs and who pays them. Explanations of broadband's alphabet soup - HFC, DSL, FTTH, and all the rest - are included as well. The report's findings and recommendations address regulation, the roles of communities, needed research, and other aspects, including implications for the Telecommunications Act of 1996.

    • Computing & IT
      April 2003

      Government Data Centers

      Meeting Increasing Demands

      by Committee on Coping with Increasing Demands on Government Data Centers, Committee on Geophysical and Environmental Data, National Research Council

      Environmental data centers have been successfully acquiring, disseminating, and archiving data for decades. However, the increasing volume and number of data sets, coupled with greater demands from more diverse users, are making it difficult for data centers to maintain the record of environmental change. This workshop report focuses on technological approaches that could enhance the ability of environmental data centers to deal with these challenges, and improve the ability of users to find and use information held in data centers. Among the major findings are that data centers should rely more on off-the-shelf technology -- including software and commonly available hardware -- and should shift from tape to disk as the primary storage medium. Such technological improvements will help solve many data management problems, although data centers and their host agencies will have to continue to invest in the scientific and human elements of data center operations.

    • Computing & IT
      July 2003

      Building an Electronic Records Archive at the National Archives and Records Administration

      Recommendations for Initial Development

      by Committee on Digital Archiving and the National Archives and Records Administration, Robert F. Sproull and Jon Eisenberg, Editors, National Research Council

      Like its constituent agencies and other organizations, the federal government generates and increasingly saves a large and growing fraction of its records in electronic form. Recognizing the greater and greater importance of these electronic records for its mission of preserving "essential evidence," the National Archives and Records Administration (NARA) launched a major new initiative, the Electronic Records Archives (ERA). NARA plans to commence the initial procurement for a production-quality ERA in 2003 and has started a process of defining the desired capabilities and requirements for the system. As part of its preparations for an initial ERA procurement, NARA asked the National Academies' Computer Science and Telecommunications Board (CSTB) to provide independent technical advice on the design of an electronic records archive, including an assessment of how work sponsored by NARA at the San Diego Supercomputer Center (SDSC) helps inform the ERA design and what key issues should be considered in ERA's design and operation. Building an Electronic Records Archive at the National Archives and Records Administration provides preliminary feedback to NARA on lessons it should take from the SDSC work and identifies key ERA design issues that should be addressed as the ERA procurement process proceeds in 2003.
