What Is the Electronic Discovery Reference Model (EDRM)?


Imagine you're a chef making a cake. You've got all the ingredients, but they're mixed up together in a big bowl. Your sous chef, who missed the prep, asks what you put in and how much. What do you do? You take the ingredients out of the bowl one by one and start measuring them, and that's precisely what EDRM does. EDRM, the Electronic Discovery Reference Model, breaks electronic data down into its constituent parts and then provides a set of procedures for handling those parts in the most cost-effective way possible. After all, if you don't know how much flour is in your mix, how can you know how much water to add? If you don't know what kind of flour it is (wheat, rice, or something else), the same problem applies. So EDRM works by separating electronic data into its constituent parts, such as emails, documents, and spreadsheets, and giving instructions on handling each piece. It also guides how to handle other types of information: social media posts, podcasts, voice recordings and more. The most important thing about EDRM is that it ensures data is dealt with efficiently without sacrificing quality.

The Electronic Discovery Reference Model (EDRM) is a way of planning and executing legal discovery in an electronic format. It was developed by Tom Gelbmann and George Socha in 2005, and it's used by electronic data providers and consumers when electronic data is gathered and assimilated as part of the legal process. So, what does this mean? It means that EDRM is used by anyone involved in gathering evidence electronically, whether they're looking for evidence or trying to protect themselves from evidence gathered against them.
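To make the "sort the bowl, then follow a procedure per ingredient" idea concrete, here is a minimal sketch in Python of a discovery workflow that walks each item through the standard EDRM stages and routes it by type. The stage names are the real EDRM phases; the item structure, the `classify` routing rules, and the handler descriptions are illustrative assumptions, not part of the model itself.

```python
# The nine standard EDRM stages, in order.
EDRM_STAGES = [
    "Information Governance",
    "Identification",
    "Preservation",
    "Collection",
    "Processing",
    "Review",
    "Analysis",
    "Production",
    "Presentation",
]

def classify(item):
    """Break electronic data into constituent parts by type (hypothetical rules)."""
    handlers = {
        "email": "extract headers, body, and attachments",
        "document": "extract text and metadata",
        "spreadsheet": "extract cells and formulas",
    }
    return handlers.get(item.get("type", "unknown"), "flag for manual handling")

def run_pipeline(items):
    """Walk every item through each EDRM stage in order, keeping an audit log."""
    log = []
    for item in items:
        for stage in EDRM_STAGES:
            log.append((item["id"], stage, classify(item)))
    return log

audit = run_pipeline([{"id": "doc-1", "type": "email"}])
print(len(audit))  # 9: one log entry per stage for the single item
```

In a real e-discovery platform each stage would of course do actual work (legal holds, de-duplication, privilege review); the point here is only the shape of the model: a fixed, ordered set of stages applied per piece of electronically stored information.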


Related Terms by Data Management

Data Vaulting

Data vaulting is like having a super-secret, ultra-safe subterranean vault to keep your precious data. Just as a traditional vault stores valuable items such as gems and money, a data vault preserves valuable data. In the field of computer science, "data vaulting" refers to the practice of backing up and storing data in an off-site location that is both safe and distant. This helps secure the data against calamities such as fire or water damage as well as theft, much like a vault protects precious things from threats.

The "off-site storage" part of data vaulting is crucial. Off-site storage is like keeping essential assets in a vault in a city far from where they are used: it safeguards the data from local disasters and minimizes the likelihood of losing it. Another significant technical term here is "incremental backup". Instead of backing up the complete data set every time, an incremental backup copies only the parts of the data that have been modified since the last backup. This saves time and storage space, just as you only need to deposit newly acquired valuables in the vault rather than moving everything in each time.

Safeguarding data in a data vault is an essential part of any disaster recovery and business continuity strategy. Data vaulting protects precious data the same way a vault protects valuable objects, which enables businesses to recover swiftly from disasters and reduces the downtime they experience. So consider using data vaulting to ensure your sensitive information's safety! It is comparable to possessing a top-secret, extremely secure, underground vault for sensitive data, complete with off-site storage and incremental backups. Have faith in us; your data will be grateful.
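The incremental-backup idea above can be sketched in a few lines of Python: hash each file, compare against a manifest of what the vault already holds, and copy only what changed. The function and manifest format are illustrative assumptions, not a real backup product; production tools also handle deletions, encryption, and transport to the remote site.

```python
import hashlib
import os
import shutil

def file_digest(path):
    """Hash a file's contents so unchanged files can be skipped."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def incremental_backup(source_dir, vault_dir, manifest):
    """Copy only files whose contents changed since the last run.

    `manifest` maps relative paths to their last-seen digest; it plays
    the role of the vault's record of what is already stored.
    """
    copied = []
    for root, _dirs, files in os.walk(source_dir):
        for name in files:
            src = os.path.join(root, name)
            rel = os.path.relpath(src, source_dir)
            digest = file_digest(src)
            if manifest.get(rel) != digest:  # new or modified file
                dest = os.path.join(vault_dir, rel)
                os.makedirs(os.path.dirname(dest), exist_ok=True)
                shutil.copy2(src, dest)
                manifest[rel] = digest
                copied.append(rel)
    return copied
```

Run it twice in a row and the second pass copies nothing, which is exactly the time and storage saving the vault analogy describes.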


Data Brokering

Data brokering has hit the big time. It's creating buzz, controversy, and even a few scandals as companies mine and sell information about how people spend money. Does that mean you have to hand your wallet to a data broker? Not quite: there are many ways to defend yourself from businesses that want your information and plan to profit from it later.

Data brokering is a collaborative process of bringing together the right data sets to address a business problem. It requires expertise, domain knowledge, and the ability to navigate different datasets to find the ones that contain the information needed to solve a particular issue. It may also require data cleansing to make the information as valuable and easily understood as possible. In the data brokering model, providers make their data available to other businesses. Data consumers can search for data that meets their requirements; once a dataset is selected, it is downloaded and used for a specific business objective, for example, purchasing data from a pollster and selling it to a political campaign.

Data brokering is a collaborative process across industries, countries, and cultures. Companies that offer data to other businesses are called data brokers. Data brokers must consider the laws and regulations that apply to their data, as well as the technical requirements of data consumers using different systems and technologies, and they must build a system that enables other businesses to access the data on offer. Data brokering is not a one-time effort; it is an ongoing process that requires continuous updating and maintenance. Most of us don't realize how much of our data is collected, sold, and used by companies to formulate targeted advertising. A glimpse into the lucrative world of data brokering illustrates just how much information about us is being bought and sold.
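The publish/search/download flow described above can be sketched as a tiny catalog in Python. The `DataBroker` class, its method names, and the sample dataset are all illustrative assumptions; real brokering platforms add access control, licensing, billing, and compliance checks around the same basic shape.

```python
# A toy sketch of the provider/consumer brokering model.
class DataBroker:
    def __init__(self):
        self._catalog = {}  # dataset name -> {"tags": set, "rows": list}

    def publish(self, name, tags, rows):
        """A provider makes a dataset available, with searchable tags."""
        self._catalog[name] = {"tags": set(tags), "rows": list(rows)}

    def search(self, tag):
        """A consumer searches for datasets that meet their requirements."""
        return [name for name, d in self._catalog.items() if tag in d["tags"]]

    def download(self, name):
        """Once selected, the dataset is retrieved for a business objective."""
        return list(self._catalog[name]["rows"])

broker = DataBroker()
broker.publish("polling-2024", ["politics", "survey"],
               [{"region": "west", "support": 0.52}])
print(broker.search("survey"))  # ['polling-2024']
```

Note how the broker sits in the middle: providers never talk to consumers directly, which is why the ongoing maintenance of the catalog falls on the broker.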


Digital Video Broadcasting-Satellite Second Generation (DVB-S2)

There is a new standard in town, and digital is here to stay. Digital Video Broadcasting-Satellite Second Generation (DVB-S2) has been around since 2003, and it's finally picking up steam with broadcasters and consumers alike. Its predecessor, DVB-S (the first generation of satellite digital television), debuted in the mid-'90s and was formally adopted by the European Telecommunications Standards Institute (ETSI). Now we're moving on to the next generation of digital broadcasting: DVB-S2.

What makes this new technology so unique? It's not just faster than the old one; it's also more reliable and flexible. You can expect higher data rates, better channel capacity, improved error-correction capabilities and, most importantly, better picture quality! The DVB-S2 standard provides specifications for delivering high-definition and ultra-high-definition television (HDTV and UHDTV) video and audio over satellite and cable networks, typically through a standardized satellite set-top box or a high-end residential gateway. The standard is designed to be extensible to deliver new services such as 3G/LTE mobile, IPTV, and OTT content. DVB-S2 was ratified by the ETSI in March 2005 and published in October of that year, and manufacturers were expected to implement the standard in equipment by the second quarter of 2006.

DVB-S is like the first-generation iPhone: it was revolutionary, it changed everything, and everyone wanted to get their hands on it. Then, after about a decade of using that same old technology, we were ready for something new, a sleek second-generation device that makes all your friends jealous when they see you using it. DVB-S2 is beautiful, fast, and flexible like that second-generation upgrade. The only catch is that it hasn't been entirely rolled out yet.

