Gigabyte Aorus 17 review: Lots of oomph with lower-key specs
At a glance
Expert’s Rating
Pros
- Solid performance
- Big, high refresh rate display
- Great keyboard
Cons
- Low screen brightness
- Giant trackpad can be finicky
- Bulky DC jack takes up mousing space for righties
Our Verdict
Despite slightly underwhelming specs, the Aorus 17 provides a high-performance gaming experience with a side of a high refresh rate.
Price When Reviewed
$3,299.98
At first glance, the pricey, powerful new Gigabyte Aorus 17 is a bit of an odd bird. Boasting neither the most powerful Nvidia RTX graphics nor the highest-end Intel Core i9 processor, it isn’t an obvious shoo-in for video editing and design work. With a 90% screen-to-body ratio, the footprint of this device would have been in the 15-inch class only a few years ago, making it TARDIS-like in its bigger-on-the-inside feeling.
The styling leans towards gamer chic, from its sharp angles to its RGB programmable keyboard to its ports. And then there’s the screen—Gigabyte put in a gloriously fast 1080p 360Hz IPS panel. You won’t be editing native 4K video on this screen, but if you’re a gamer who demands high refresh rates on the go, the Aorus 17 is a competitive entry.
Specs and features
Our review unit has an Intel Core i7-12700H processor and an Nvidia GeForce RTX 3080 Ti GPU. This combination makes it quite a capable gaming machine. It’s packing 32GB of RAM and a 1TB PCIe NVMe SSD as well. The display is a 17.3-inch full HD (1080p) screen with a high 360Hz refresh rate. For further details, take a look at the spec list below.
- CPU: Intel 12th Generation Core i7-12700H (6P, 8E cores)
- RAM: 32GB DDR5-4800 in dual-channel mode
- GPU: Nvidia GeForce RTX 3080 Ti @ 130W TGP
- Screen: 17.3-inch, 16:9, full HD (1920×1080) display with 360Hz refresh
- Storage: 1 TB Gigabyte AG470 PCIe Gen4 NVMe SSD
- Networking: Wi-Fi 6E with Intel AX211, 2.5G ethernet
- Size and weight: 15.7 (W) x 10 (D) x 0.94–1.1 (H) inches, 5.95 lbs plus 1.4 lbs for power brick
- Battery size: 99Wh
- Ports: Two USB-A 3.1, one Thunderbolt 4, mini DisplayPort 1.4, HDMI 2.1, Ethernet, analog headset jack, DC-in
- Price: $3,299.95
Keyboard and trackpad
IDG / Brendan Nystedt
The keyboard in this big 17-inch Gigabyte is surprisingly good, and not just for gaming, either. With full-sized keys on the numpad and natural key spacing, it could power through spreadsheets and term papers as well as anything, making the Aorus 17 a great pick for anyone who juggles numbers frequently. If I were to pick any nit, it’d be the slightly shrunken right Shift key, which was a bit of an annoyance, but one I adapted to with practice.
The per-key RGB on the Aorus 17 is pretty stunning with some impressive built-in effects and super smooth transitions. Of course, it’s a quick key combo (fn + space) to flick it on or off as the circumstances require.
IDG / Brendan Nystedt
The trackpad, on the other hand, is a bit fiddly. It’s a big, MacBook-sized number with the Windows Precision driver. Sensitivity is good and gestures work well. But, for my use patterns, I found it was far too large and I brushed it with my palms quite often, dislodging the cursor when I wasn’t typing. I had a much easier time after tuning sensitivity and turning off features like tap-to-click.
Audio subsystem
Despite its compact-for-a-17-inch billing, the Aorus 17 still makes room for some decent-sounding speakers. The DTS:X powered speakers can ably play podcasts or music. Bass is audible and the mids and highs come through strong but you’re going to hear some distortion once you venture past the 75% volume mark. Obviously, if you’re hoping to crank in-game audio, the speakers have to compete with the loud fans. Unfortunately, the fans will win out. You’ll want to get some headphones!
Webcam
IDG / Brendan Nystedt
Having a bezel-free screen used to mean putting up with a camera at an awkward angle. Gigabyte retained skinny bezels by putting the Aorus’s camera equipment up in a protruding lip—a nega-notch, if you will. This makes room for a slightly bigger HD camera, the requisite sensors for Windows Hello log-in, and even the microphones.
IDG / Brendan Nystedt
The camera does an acceptable job of putting your mug on the world wide web for all to see. It makes me look way more pasty than I actually am. Additionally, even in bright light, this 720p unit annihilates detail in a way that makes me think of a skin-smoothing beauty mode. It’s better than nothing (ahem, MSI), but it’s a far cry from what’s available as an external attachment.
Build quality and acoustics
Like many computers in the gaming category, the Aorus 17 sticks with matte black plastic, and the choice does a lot for it. Sure, it doesn’t feel as premium as aluminum or magnesium, but it’s durable, doesn’t get hot to the touch, and tends to age well. The only major downside is that, as on most black gadgets, fingerprints build up, in this case most noticeably on the Aorus 17’s keycaps and trackpad.
The Aorus feels solidly built, although its blend of thinness and big deck size definitely feels less rigid than something with a smaller footprint. Thankfully, the plastic chassis keeps the keyboard in check. Even when typing vigorously, I never noticed keyboard deck flex; you’ll have to press mighty hard to notice any kind of deformation of the top case.
IDG / Brendan Nystedt
IDG / Brendan Nystedt
Ports are placed on either side of the computer, with nothing along the rear edge. It’s nice to see a fully featured Thunderbolt 4 port here, especially one that can use USB-C PD for charging. Unfortunately, your other charging option is a chunky right-angle DC barrel plug, which protrudes quite a bit on the right-hand side. While southpaws will likely never notice, the majority of gamers might be annoyed. I certainly was, as it ate into valuable mousing area on a smaller desktop.
Part of the Aorus’s near-6 pound weight is due to its massive cooling system. Six copper heatpipes make sure heat is dissipated quickly and huge vents ensure airflow is never an issue. In one of its standard modes, you’ll definitely hear the fans whirring away thanks to a mild high-pitched whine that’s noticeable whenever there’s air on the move. Thankfully, if you activate the quiet/battery saver performance mode in the Gigabyte App, the Aorus 17 becomes as mild-mannered, well-behaved, and chill as an ultrabook.
Display
IDG / Brendan Nystedt
The headline feature here is Aorus’ 17.3-inch, 360Hz screen. While it’s only a 1080p panel, it’s IPS and has a claimed 72% NTSC color gamut. Viewing angles are solid with little to no color shift found off-axis.
I found that in more recent AAA games—like Halo Infinite and Forza Horizon 5—you’ll get nowhere near the top framerate out of this screen. If those are the kinds of games you play, it might make sense to go for something with a lower refresh rate, say 120Hz. But, if you’re into esports titles, you’re in for a real treat here. 360Hz in Rocket League was an eyeball-tickling experience on the Aorus 17!
If there’s any area where this screen is a letdown, it’s the brightness. We measured only 250 nits peak brightness, making it way dimmer than other laptops in this category. If you mostly use your computers indoors under controlled lighting, that might not matter. But if you’re hoping to enjoy this matte 17-inch screen outdoors or near windows, you’ll quickly find that there’s not quite enough brightness to cut through the glare.
Upgrades
Like many 12th gen Intel-based gaming laptops, the Aorus 17 has a bit of room to grow. Although it has a capacious 1 TB NVMe SSD as standard in this configuration, you’ll find another unpopulated M.2 slot waiting for more storage. The stock configuration has socketed RAM as well, featuring 32 GB of DDR5-4800 on two 16 GB DIMMs. Many people will never need more RAM than that, but it might put you at ease knowing this machine can accommodate up to 64 GB in the future.
Performance
You’re not only here to read about M.2 slots and DDR5…you probably want to know how this big laptop performs!
IDG / Ashley Biancuzzo
Let’s start with Maxon’s Cinebench R20, which uses the company’s Cinema4D engine to push the CPU to its limits. This test favors more powerful CPU cores. With 14 cores and 20 threads, the Gigabyte’s i7-12700H processor has plenty of oomph to handle multithreaded tasks. Although other laptops in this price class feature a slightly faster Core i9, the Aorus shows it is no slouch.
IDG / Ashley Biancuzzo
So, you’ve got plenty of cores to push around. How about single-core performance? Thankfully, the R20 benchmark measures that as well. This benchmark gives you an idea as to how the Aorus 17 will perform when using standard general computing apps like web browsers or Microsoft Office. As you can see in the graph above, the Gigabyte scores about the same as other laptops with Core i7 12th-gen processors. The MSI GE76 Raider has a little more power due to its Core i9 12th-gen processor.
IDG / Ashley Biancuzzo
Thanks to all its cores and ample cooling, the Aorus 17 chews through video encoding like a kid devouring celery and peanut butter at snack time. For our standard video encode test, we use Handbrake to crunch down a 1080p Blu-ray rip for use on an Android tablet. In the graph above, a shorter bar is a better result. The Aorus encoded a video in just 16 minutes, which is nothing short of impressive. These 12th gen machines are great for video encoding, making it perfect if you keep a Plex library as a hobby.
3D gaming performance
IDG / Ashley Biancuzzo
Let’s take a look at 3D gaming performance, shall we? To start, we ran the 3DMark Time Spy 1.2 benchmark. Thanks to its Nvidia RTX 3080 Ti GPU, the Aorus kept pace. However, it was marginally slower than similar models, likely because of its lower-than-average 130W TGP power envelope. The only recent machine to outperform the Aorus 17 was the MSI GE76 Raider, which has a faster-clocked Nvidia chip and an i9 processor to boot.
IDG / Ashley Biancuzzo
In our older Rise of the Tomb Raider benchmark, the Aorus 17 was more than up to the task. It scored 149 FPS in the 1080p/Very High test, which was only a bit lower than the other RTX 3080 Ti-equipped machines.
IDG / Ashley Biancuzzo
In our intensive Metro Exodus benchmark, the Aorus came in a bit behind the pack, but not by much, showing that the slightly weaker GPU can still put up a fight. We found that it didn’t rack up higher numbers when running in a dedicated gaming power mode; that’s because the computer’s default AI Boost rather deftly switches modes without user intervention. This is awesome because it’s one less thing users need to fiddle with. You’ll see great performance if you simply keep the computer’s default setting.
Battery life
Unfortunately, despite its big 99Wh battery, the Aorus 17’s powerful components and big display are a drain. During our standard battery rundown test, in which we run a 4K video on loop, the Aorus 17 barely made it to six hours of runtime. While you’ll probably get more working time out of the battery if you’re careful, don’t count on watching movies on a long-haul flight without your charger. Anecdotally, under a light workload, I lost around 10% every hour. That’s with a bunch of Edge tabs open, OneNote running, and a YouTube video playing in the background.
Conclusion
The latest Gigabyte Aorus is far from a disappointment. On paper it might not seem as impressive as its peers, with its lower-wattage Nvidia GPU and an i7 instead of an i9, but in reality this package still packs a wallop. Its big screen is a bit dim and the port layout’s a bit of a mess, but on the whole the Aorus 17 is a competitive laptop that shines when it comes to gaming. You’re getting a big-screen, high-refresh-rate experience on the go.
Why trusted execution environments will be integral to proof-of-stake blockchains
Ever since the invention of Bitcoin, we have seen a tremendous outpouring of computer science creativity in the open community. Despite its obvious success, Bitcoin has several shortcomings. It is too slow, too expensive, the price is too volatile and the transactions are too public.
Various cryptocurrency projects in the public space have tried to solve these challenges, and there is particular interest in the community in solving the scalability challenge. Bitcoin’s proof-of-work consensus algorithm supports a throughput of only about seven transactions per second. Other blockchains such as Ethereum 1.0, which also relies on proof-of-work, demonstrate similarly mediocre performance. This has an adverse impact on transaction fees, which vary with the amount of traffic on the network: sometimes they may be lower than $1, at other times higher than $50.
Proof-of-work blockchains are also very energy-intensive. As of this writing, the process of creating Bitcoin consumes around 91 terawatt-hours of electricity annually. This is more energy than used by Finland, a nation of about 5.5 million.
Some commentators see this as the necessary cost of securing an entire financial system, rather than just the cost of running a digital payment system. Others think the cost could be done away with by developing proof-of-stake consensus protocols. Proof-of-stake protocols also deliver much higher throughputs: some blockchain projects are aiming at delivering upwards of 100,000 transactions per second. At this performance level, blockchains could rival centralized payment processors like Visa.

The shift toward proof-of-stake consensus is quite significant. Tendermint is a popular proof-of-stake consensus framework. Several projects such as Binance DEX, Oasis Network, Secret Network, Provenance Blockchain, and many more use the Tendermint framework. Ethereum is transitioning toward becoming a proof-of-stake-based network. Ethereum 2.0 is likely to launch in 2022 but already the network has over 300,000 validators. After Ethereum makes the transition, it is likely that several Ethereum Virtual Machine (EVM) based blockchains will follow suit. In addition, there are several non-EVM blockchains such as Cardano, Solana, Algorand, Tezos and Celo which use proof-of-stake consensus.
Proof-of-stake blockchains introduce new requirements
As proof-of-stake blockchains take hold, it is important to dig deeper into the changes that are unfolding.
First, there is no more “mining.” Instead, there is “staking.” Staking is the process of putting the blockchain’s native currency at stake to obtain the right to validate transactions. The staked cryptocurrency is made unusable for transactions, i.e., it cannot be used for making payments or interacting with smart contracts. Validators that stake cryptocurrency and process transactions earn a fraction of the fees paid by entities that submit transactions to the blockchain. Staking yields are often in the range of 5% to 15%.
Second, unlike proof-of-work, proof-of-stake is a voting-based consensus protocol. Once a validator stakes cryptocurrency, it is committing to staying online and voting on transactions. If for some reason a substantial number of validators go offline, transaction processing would stop entirely, because a supermajority of votes is required to add new blocks to the blockchain. This is quite a departure from proof-of-work blockchains, where miners could come and go as they pleased and their long-term rewards would depend on the amount of work they did while participating in the consensus protocol. In proof-of-stake blockchains, validator nodes are penalized, and a part of their stake is taken away, if they do not stay online and vote on transactions.
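To make the supermajority requirement concrete, here is a minimal sketch in the style of Tendermint-like BFT protocols, where strictly more than 2/3 of total voting power must vote to commit a block. The validator names and power numbers are illustrative assumptions, not any specific chain's values:

```python
# Toy supermajority check: a block commits only when the voting power behind
# it strictly exceeds 2/3 of total staked power (Tendermint-style BFT).

def can_commit(votes, total_power):
    """votes: {validator_name: voting_power} for validators that voted."""
    voted_power = sum(votes.values())
    # Strict inequality: exactly 2/3 is not enough in these protocols.
    return 3 * voted_power > 2 * total_power

# Four equal-power validators; one is offline -> 75% voted, block commits.
print(can_commit({"val-a": 25, "val-b": 25, "val-c": 25}, total_power=100))  # True

# Two validators offline -> only 50% voted, the chain halts.
print(can_commit({"val-a": 25, "val-b": 25}, total_power=100))  # False
```

This is why, as noted above, a chain stops producing blocks entirely once enough validators go dark: the threshold is over total staked power, not over whoever happens to be online.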

Third, in proof-of-work blockchains, if a miner misbehaves, for example, by trying to fork the blockchain, it ends up hurting itself. Mining on top of an incorrect block is a waste of effort. This is not true in proof-of-stake blockchains. If there is a fork in the blockchain, a validator node is in fact incentivized to support both the main chain and the fork. This is because there is always some small chance that the forked chain turns out to be the main chain in the long term.
Punishing blockchain misbehavior
Early proof-of-stake blockchains ignored this problem and relied on validator nodes participating in consensus without misbehaving. But this is not a good assumption to make in the long term, so newer designs introduce a concept called “slashing.” If a validator node observes that another node has misbehaved, for example by voting for two separate blocks at the same height, the observer can slash the malicious node. The slashed node loses part of its staked cryptocurrency. The magnitude of the slashing penalty depends on the specific blockchain; each has its own rules.
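The double-voting case described above can be sketched in a few lines; the tuple shapes and names below are assumptions for illustration, not any chain's actual evidence format:

```python
# Sketch of slashing-evidence detection: a validator that signs two different
# block hashes at the same height has provably double-signed.

def find_double_signs(votes):
    """votes: iterable of (validator, height, block_hash) tuples.
    Returns {validator: height} for each provable double-sign."""
    seen = {}        # (validator, height) -> block_hash first seen
    evidence = {}
    for validator, height, block_hash in votes:
        key = (validator, height)
        if key in seen and seen[key] != block_hash:
            evidence[validator] = height   # conflicting votes at same height
        else:
            seen[key] = block_hash
    return evidence

votes = [
    ("val-a", 100, "0xaaa"),
    ("val-b", 100, "0xaaa"),
    ("val-b", 100, "0xbbb"),   # val-b voted for two blocks at height 100
]
print(find_double_signs(votes))  # {'val-b': 100}
```

Note that re-broadcasting the *same* vote is harmless; only conflicting votes at one height constitute evidence, which is exactly the distinction the slashing rules rely on.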

Fourth, in proof-of-stake blockchains, misconfigurations can lead to slashing. A typical misconfiguration is one where multiple validators, which may be owned or operated by the same entity, end up using the same key for validating transactions. It is easy to see how this can lead to slashing.
Finally, early proof-of-stake blockchains had a hard limit on how many validators could participate in consensus. This is because each validator signs a block two times, once during the prepare phase of the protocol and once during the commit phase. These signatures add up and could take up quite a bit of space in the block. This meant that proof-of-stake blockchains were more centralized than proof-of-work blockchains. This is a grave issue for proponents of decentralization and consequently, newer proof-of-stake blockchains are shifting towards newer crypto systems that support signature aggregation. For example, the Boneh-Lynn-Shacham (BLS) cryptosystem supports signature aggregation. Using the BLS cryptosystem, thousands of signatures can be aggregated in such a way that the aggregated signature occupies the space of only a single signature.
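Real BLS aggregation relies on elliptic-curve pairings, which are beyond a short sketch. As a deliberately insecure stand-in, the toy scheme below uses plain modular arithmetic to show the key property: in a linear scheme, thousands of signatures collapse into one value that verifies against the sum of the public keys. Every constant here is an illustrative assumption.

```python
# INSECURE toy mimicking the *shape* of BLS aggregation. Because the scheme
# is linear, sum(signatures) verifies against sum(public keys).

Q = 2**61 - 1    # a prime modulus
G = 7            # a fixed "generator"

def keygen(seed):
    sk = seed % Q
    return sk, (sk * G) % Q            # (secret key, public key)

def sign(sk, msg_digest):
    return (sk * msg_digest) % Q

def aggregate(sigs):
    return sum(sigs) % Q               # many signatures -> one value

def verify_aggregate(agg_sig, pubkeys, msg_digest):
    # agg*G == sum(sk_i)*h*G == sum(pk_i)*h  (mod Q), by linearity
    return (agg_sig * G) % Q == (sum(pubkeys) * msg_digest) % Q

h = 123456789    # stand-in for a hashed message, same for all signers
keys = [keygen(s) for s in (11, 22, 33)]
agg = aggregate(sign(sk, h) for sk, _ in keys)
print(verify_aggregate(agg, [pk for _, pk in keys], h))  # True
```

The space saving is the point: the aggregated signature is one field element regardless of how many validators signed, which is what lets newer proof-of-stake chains support far larger validator sets per block.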
How trusted execution environments can be integral to proof-of-stake blockchains
While the core philosophy of blockchains revolves around the concept of trustlessness, trusted execution environments can be integral to proof-of-stake blockchains.
Secure management of long-lived validator keys
For proof-of-stake blockchains, validator keys need to be managed securely. Ideally, such keys should never be available in clear text; they should be generated and used inside trusted execution environments. Those environments also need to ensure disaster recovery and high availability, and they need to be always online to cater to the demands of validator nodes.
Secure execution of critical code
Trusted execution environments today are capable of more than secure key management. They can also be used to deploy critical code that operates with high integrity. In the case of proof-of-stake validators, it is important that conflicting messages are not signed. Signing conflicting messages can lead to economic penalties according to several proof-of-stake blockchain protocols. The code that tracks blockchain state and ensures that validators do not sign conflicting messages needs to be executed with high integrity.
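As an illustration of the kind of high-integrity check described here, the sketch below tracks what has been signed at each height and refuses to approve a conflicting message. The interface is an assumption; in practice this state-tracking logic would run inside the trusted execution environment, alongside the key:

```python
# Illustrative anti-slashing guard: refuse to co-sign a second, different
# message at a height that has already been signed.

class SigningGuard:
    def __init__(self):
        self._signed = {}   # height -> digest already signed at that height

    def approve(self, height, digest):
        prior = self._signed.get(height)
        if prior is not None and prior != digest:
            return False    # conflicting message: signing would be slashable
        self._signed[height] = digest
        return True

guard = SigningGuard()
print(guard.approve(100, "0xaaa"))  # True:  first message at height 100
print(guard.approve(100, "0xaaa"))  # True:  re-signing the same message is safe
print(guard.approve(100, "0xbbb"))  # False: conflicting message, refuse
```

Running this check with high integrity is the whole point: if an attacker can tamper with the guard's state, they can trick the validator into a slashable double-sign even without stealing the key itself.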
Conclusions
The blockchain ecosystem is changing in very fundamental ways. There is a large shift toward using proof-of-stake consensus because it offers higher performance and a lower energy footprint as compared to a proof-of-work consensus algorithm. This is not an insignificant change.
Validator nodes must remain online and are penalized for going offline. Keeping keys secure while they remain always online is a challenge.
To make the protocol work at scale, several blockchains have introduced punishments for misbehavior. Validator nodes continue to suffer these punishments because of misconfigurations or malicious attacks on them. To retain the large-scale distributed nature of blockchains, new cryptosystems are being adopted. Trusted execution environments that offer disaster recovery, high availability, support new cryptosystems such as BLS and allow for the execution of custom code with high integrity are likely to be an integral part of this shift from proof-of-work to proof-of-stake blockchains.
Pralhad Deshpande, Ph.D., is senior solutions architect at Fortanix.
DataDecisionMakers
Welcome to the VentureBeat community!
DataDecisionMakers is where experts, including the technical people doing data work, can share data-related insights and innovation.
If you want to read about cutting-edge ideas and up-to-date information, best practices, and the future of data and data tech, join us at DataDecisionMakers.
You might even consider contributing an article of your own!
How NFTs in the metaverse can improve the value of physical assets in the real world
The metaverse has become inseparable from Web3 culture. Companies are racing to put out their own metaverses, from small startups to Mark Cuban and, of course, Meta. Before companies race to put out a metaverse, it’s important to understand what the metaverse actually is.
Or what it should be.
The prefix “meta” generally means “self-referential” or “about.” In other words, a meta-level is something about a lower level. From dictionary.com:
“a prefix added to the name of a subject and designating another subject that analyzes the original one but at a more abstract, higher level:
metaphilosophy; metalinguistics.
a prefix added to the name of something that consciously references or comments upon its own subject or features:
a meta-painting of an artist painting a canvas.
The key aspect of both definitions is self-reference. Logically, the term “metaverse” then should be “a universe that analyzes the original one, but at an abstracted level.” In other words, the metaverse will be an abstraction layer that describes our current physical world.
The metaverse should be an extended reality, not a whole new one.
And that’s why the trend has been heading toward a metaverse that’s built on crypto. Crypto, just like the world, has a kind of physical nature to it. You can’t copy a Bitcoin or an NFT. Just like the coffee cup on your desk can’t occupy the same physical space as the cup next to it. The space itself is singular and immutable and can’t be copied. Even if you make a 3D-printed replica, it’s not the same cup. So crypto is very well suited to building an immutable layer that describes the real world. In crypto, we can build models of the real world that carry over many of its properties.
The natural opportunity will be in digital twins. Digital twins create a universe of information about buildings or other physical assets and are tied to the physical world. In other words, they are that meta-layer. By integrating blockchain technology, in the form of NFTs, all data and information surrounding the physical twin can be verified and saved, forever, all tracked with the asset itself. When you think about it, digital twins are the metaverse versions of the physical twins, and the technology enhances features of the real world.
Validation is the key to metaverse truth
When evaluating crypto/blockchain’s relationship to the metaverse, it’s important to remember that crypto is about verification and validation. So when considering blockchain’s relationship to the metaverse, it makes sense to think about it as a digital space that can be validated.
So in the metaverse, it’s time to expand on what an NFT is and what it can hold. NFTs cannot be copied because they are tied to the validation and verification process in time, which is what makes them nonfungible. As the capabilities of NFTs grow, they are becoming a new information dimension that is tied to the real world.
NFT domains are going to be core to this idea. They become a nonfungible data space, uniquely tied to us and our activity on Web3. In the metaverse, these domain NFTs can represent a house, recording and validating every visitor, repair, event, etc. And that record and that infrastructure can be sold not just with the house but as a core component of the house, increasing its value.
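As a toy illustration of such a tamper-evident asset history (this is a plain hash chain in ordinary Python, not actual NFT metadata or on-chain storage), each event commits to the hash of the one before it, so any later edit to the record is detectable:

```python
# Toy hash-chained history for a physical asset: append-only, tamper-evident.

import hashlib
import json

def append_event(chain, event):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"prev": prev_hash, "event": event}, sort_keys=True)
    chain.append({"event": event, "prev": prev_hash,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})

def verify(chain):
    prev_hash = "0" * 64
    for entry in chain:
        payload = json.dumps({"prev": prev_hash, "event": entry["event"]},
                             sort_keys=True)
        if entry["prev"] != prev_hash or \
           entry["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        prev_hash = entry["hash"]
    return True

history = []
append_event(history, {"type": "repair", "detail": "roof", "year": 2021})
append_event(history, {"type": "sale", "price": 450000, "year": 2022})
print(verify(history))  # True
history[0]["event"]["detail"] = "none"   # quietly rewrite the past...
print(verify(history))  # False: the stored hashes no longer match
```

The economics described above follow from this property: because the history cannot be quietly rewritten, a buyer can trust it, and a trustworthy record of repairs and events becomes part of the asset's value.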
By clearly defining what a true metaverse is, both for developers and investors, we can start to move toward a meaningful version of it.
Leonard Kish is cofounder of Cortex App, based on YouBase’s distributed data protocol.
Protecting the modern workforce requires a new approach to third-party security
Ask any HR leader: they’ll tell you that attracting and retaining employees continues to be a top challenge. While this has never been easy, there’s little doubt that the COVID-19 pandemic (and distributed workforces) have made things even more complex. As you read this article, many workers are actively considering leaving their current roles, which don’t support their long-term goals or desired work-life balance. While organizations attempt to navigate this “Great Resignation,” more than 4 million workers are still resigning every month.
As 2022 marches on, hiring teams face another massive obstacle: global talent shortages. These trends have companies rushing to find creative stop-gap solutions to ensure business continuity in difficult times. It shouldn’t come as a surprise that more companies are relying on third-party vendors, suppliers and partners to meet short-term needs, reduce costs and keep innovation humming. In addition, the rise of the gig economy has more employees entering into nontraditional or temporary working relationships. This trend is particularly prevalent in the healthcare industry, but as many as 36% of American employees have a gig work arrangement in some form, either alongside or instead of a full-time job.
What’s more, the corporate supplier ecosystem has become exponentially more complex. Amidst the supply chain vulnerabilities revealed by the pandemic, organizations are expanding and diversifying the number of supplier relationships they’re engaging in. Meanwhile, regulators have stepped up efforts to manage these business ecosystems.
In many cases, outsourcing to temporary workers or external partners makes good business sense. Sometimes, given the constraints of the talent pool, there’s simply no other option for a company. Either way, organizations should be aware of the security risks that third parties bring — and the steps they can take to minimize the chances of a breach occurring.
Third-party security challenges remain prevalent
Bringing a third-party workforce onboard in a rushed way, without proper governance or security controls in place, leaves organizations open to significant cyber risk. These risks can stem from the third-party users or suppliers themselves, or from those third parties’ access being compromised and used as a conduit for lateral movement, enabling attackers to reach the company’s most sensitive data. Sadly, a lack of centralized control over suppliers and partners is all too common, no matter the industry. In many organizations, unlike full-time employees, third-party users are managed on an ad hoc basis by individual departments using manual processes or custom-built solutions. This is a recipe for increased cyber risk.
Take the now-infamous Target breach, which remains among the largest-scale third-party security breaches in history. In this incident, attackers made their way onto the retail giant’s network after compromising login credentials belonging to an employee of an HVAC contractor, eventually stealing the payment and personal data of as many as 110 million customers.
In today’s world, where outsourcing and remote work are now the norm, third parties require corporate network access to get their jobs done. If companies don’t reconsider third-party security controls – and take action by addressing the root of the problem – they’ll remain open to cyber vulnerabilities that can devastate their business and its reputation.
A pervasive lack of visibility and control
Although reliance on third-party workers and technology is widespread in nearly every industry (and in some, it’s common for an organization to have more third-party users than employees), most organizations still don’t know exactly how many third-party relationships they have. Even worse, most don’t even grasp precisely how many employees each vendor, supplier or partner brings into the relationship or their level of risk. According to one survey conducted by the Ponemon Institute, 66% of respondents have no idea how many third-party relationships their organization has, even though 61% of those surveyed had experienced a breach attributable to a third party.
Grasping the full extent of third-party access can be particularly challenging when there’s collaboration with outsiders through cloud-based applications like Slack, Microsoft Teams, Google Drive or Dropbox. Of course, the adoption of these platforms skyrocketed with the large-scale shift to remote and hybrid work that has come about over the last two years.
Another challenge is that although an organization may try to maintain a supplier database, it can be near-impossible, with existing technical capabilities, to ensure that it’s both current and accurate. Because of processes like self-registration and guest invites, external identities remain disconnected from the security controls applied to employees.
Growing regulatory interest and contractual obligations
As incidents and breaches attributable to third parties continue to rise, regulators are taking notice. For instance, Sarbanes-Oxley (SOX) now includes several controls targeted explicitly at managing third-party risk. Even the Cybersecurity Maturity Model Certification (CMMC) explicitly targets improving the cybersecurity maturity of third parties that serve the federal government. The ultimate goal of such regulations is to bring all third-party access under the same compliance controls required for employees so that there’s consistency across the entire workforce and violations can be mitigated quickly.
Today, we expect companies to push their suppliers, vendors and partners to implement more stringent security controls. In the long run, however, such approaches are unsustainable, since it’s difficult, if not impossible, to enforce standards across a third-party organization. Hence, the focus will need to shift to ensuring that identity-based perimeters are robust enough to identify and manage threats that third parties may pose.
Currently, decentralized identity solutions are moving into the mainstream. As these technologies become more widely accepted, they’ll continue to mature. This will help many organizations streamline third-party management in the future. It will also assist companies on their journey toward zero trust-compatible identity postures. Incorporating ongoing security monitoring and implementing continuous identity verification systems will also become increasingly important.
Five steps to mitigate third-party risk today
Today’s challenges are complex but not unsolvable. Here are five steps organizations can take to improve third-party access governance over the short term.
1) Consolidate third-party management. This process can begin with finance and procurement. Anyone with any contract to provide services to any department in the company should be identified and cataloged in an authoritative system of record that includes information on the access privileges assigned to external users.
Security teams should test for stale accounts and deprovision any that are no longer needed or in use. In addition, they should assign sponsorship and joint accountability to third-party administrators.
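The stale-account check in step one can be automated against the system of record. The sketch below is a minimal illustration, assuming a hypothetical in-memory inventory of external identities; in practice, this data would come from an identity provider or procurement system.

```python
from datetime import datetime, timedelta

# Hypothetical inventory of external identities; in a real deployment this
# would be pulled from the authoritative system of record (IdP, HR, procurement).
third_party_accounts = [
    {"user": "vendor-anna", "sponsor": "it-ops", "last_login": datetime(2022, 1, 10)},
    {"user": "contractor-raj", "sponsor": "finance", "last_login": datetime(2022, 6, 1)},
    {"user": "partner-lee", "sponsor": None, "last_login": datetime(2021, 11, 3)},
]

STALE_AFTER = timedelta(days=90)  # assumed inactivity threshold

def find_stale_or_unsponsored(accounts, now):
    """Flag accounts that are inactive past the threshold or lack a sponsor."""
    flagged = []
    for acct in accounts:
        if acct["sponsor"] is None or now - acct["last_login"] > STALE_AFTER:
            flagged.append(acct["user"])
    return flagged

# Flagged accounts become candidates for deprovisioning or sponsor review.
print(find_stale_or_unsponsored(third_party_accounts, datetime(2022, 7, 1)))
```

Accounts without a named sponsor are flagged regardless of activity, which enforces the joint-accountability requirement described above.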
2) Institute vetting and risk-aware onboarding processes. Both the organization and its supplier/vendor need to determine workflows for vetting and onboarding third-party users to ensure they are who they say they are — and that their onboarding process follows the principle of least privilege. Implementing a self-service portal where third-party users can request access and provide required documentation can smooth the path to productivity. Access decisions should be based on risk.
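"Access decisions should be based on risk" can be made concrete with a scoring policy. The following is a simplified sketch with an assumed additive risk score and invented signal names; real solutions weigh many more signals (device posture, geolocation, contract status and so on).

```python
# Hypothetical risk weights per signal; the names and values are illustrative.
RISK_WEIGHTS = {
    "unverified_identity": 40,
    "privileged_resource": 30,
    "no_mfa": 20,
    "expired_contract": 50,
}

def access_decision(signals, threshold=50):
    """Approve, route for review, or deny based on accumulated risk score."""
    score = sum(RISK_WEIGHTS[s] for s in signals)
    if score == 0:
        return "approve"
    if score < threshold:
        return "approve_with_review"  # e.g. route to the sponsoring manager
    return "deny"

print(access_decision([]))                                 # clean request
print(access_decision(["no_mfa"]))                         # low risk
print(access_decision(["no_mfa", "unverified_identity"]))  # over threshold
```

A self-service portal would feed the signals gathered during vetting into a decision function like this, so that low-risk requests flow through quickly while riskier ones get human review.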
3) Define and refine policies and controls. The organization — and its vendors and suppliers — should continuously optimize policies and controls to identify potential violations and reduce false positives. Policies and controls must be tested periodically, and security teams should also review employees’ access. Over time, auto-remediation can minimize administrative overhead further.
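The auto-remediation idea in step three can be sketched as a simple mapping from detected violations to corrective actions. The policy names and actions below are hypothetical placeholders, not a specific product's API.

```python
# Hypothetical violation-to-action mapping; a real platform would apply these
# actions through its provisioning and ticketing integrations.
POLICIES = {
    "mfa_disabled": "enforce_mfa",
    "access_unused_90d": "revoke_access",
    "sod_conflict": "open_review_ticket",  # separation-of-duties needs a human
}

def remediate(violations):
    """Map each detected violation to its remediation; unknown ones escalate."""
    return [POLICIES.get(v, "escalate_to_security") for v in violations]

print(remediate(["mfa_disabled", "access_unused_90d", "unknown_event"]))
```

Routing known violations to automatic actions while escalating unknown ones is what lets auto-remediation reduce administrative overhead without hiding genuinely novel problems.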
4) Institute compliance controls for your entire workforce. Look for a third-party access governance solution that will enable consistency across employees and third-party users, especially since regulators increasingly require this. Having access to out-of-the-box compliance reports for SOX, GDPR, HIPAA and other relevant regulations makes it easier to enforce the appropriate controls and provide necessary audit documentation.
5) Implement privileged access management (PAM). Another critical step that organizations can take to boost their cybersecurity maturity is implementing a PAM solution. This will enable the organization to enforce least privileged access and zero-standing privilege automatically across all relevant accounts.
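Zero-standing privilege means no account holds elevated rights permanently; privileges are granted just-in-time and expire automatically. A minimal sketch of that time-boxing logic, using an assumed in-memory grant store:

```python
from datetime import datetime, timedelta

# Hypothetical in-memory grant store: user -> (role, expiry). A PAM product
# would persist this and broker the actual credentials.
active_grants = {}

def grant_privilege(user, role, now, duration_minutes=60):
    """Issue a time-boxed grant that must be re-requested after expiry."""
    active_grants[user] = (role, now + timedelta(minutes=duration_minutes))

def has_privilege(user, role, now):
    """A privilege counts only while its grant exists and is unexpired."""
    grant = active_grants.get(user)
    return grant is not None and grant[0] == role and now < grant[1]

now = datetime(2022, 7, 1, 9, 0)
grant_privilege("vendor-anna", "db-admin", now)
print(has_privilege("vendor-anna", "db-admin", now))                       # within window
print(has_privilege("vendor-anna", "db-admin", now + timedelta(hours=2)))  # expired
```

Because access is evaluated against an expiry at every check, a forgotten third-party grant simply lapses instead of lingering as standing privilege.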
The world of work will never again look like it did in 2019. The flexibility, agility and access to first-rate talent that businesses gain from embracing modern ways of working make the changes more than worthwhile. Enterprises can realize enormous value within today's complex, dynamic ecosystems of business relationships and suppliers, but they need to ensure their cybersecurity strategies can keep up by strengthening identity and third-party access governance.
Paul Mezzera is VP of Strategy at Saviynt.