Posts

Showing posts from November, 2018

Why We Desperately Need Better Cybersecurity

The Internet of Things is an idea of potentially unending consequence and infinite possibility. Essentially, it is the drive to make every device in our everyday lives communicate with other devices over the internet. It would mean that your entire house, and one day hopefully your entire life, can be controlled with your phone. Everything from your car to your refrigerator will be able to communicate, not only with your phone but with other devices and servers all over the world.

New problems are presented by the Internet of Things

The potential of the Internet of Things, as you might assume, is positively staggering: an entirely interconnected world would mean unprecedented access to data that can be used to shape the future. It is a push toward uniform access to the internet and the ability to communicate with other people all over the world, to create a world above the physical, to make an internet without borders. It is the dre...

How Tech Makes Investing More Accessible

New technology is revolutionizing what the average person can do with money. We’re seeing the rise of new types of investments, higher levels of accessibility for existing investments, and more efficiency, which ultimately leads to even further consumer engagement. So how, exactly, is this accessibility improving, and where can it go from here?

Brokerage Platforms

The emergence of online brokerage platforms completely changed the game for investing in stocks and bonds. While this hasn’t been good news for stock brokers in the financial services industry, it has made an otherwise complicated and confusing method of investing more accessible. Modern platforms allow average people to place trades with a single click, and some are even able to offer low- or no-cost trades, such as Robinhood’s famous “free trade” model.

Real Estate

New technology has also made real estate investing more accessible. Historically, real estate investors have bee...

What is Omni-Channel? 10 Examples of Brands Providing an Excellent Omni-Channel Experience

Technology has permanently changed the way brands interact with customers. Although shoppers have a multitude of options when it comes to buying products, they still prize the in-store experience. Instead of replacing brick-and-mortar stores, though, technology has created an opportunity for customers to move from store to website to phone, chat, and text message, all at their own convenience. This has driven a need for brands to start investing in business management tools that can help them reach customers wherever they are, whenever they have questions or need help.

If you’re still trying to come up with ways to move your own business to the next level, you don’t have to go it alone. You can take inspiration from some of the top brands in omnichannel, each one innovating in its respective industry by emphasizing customer experience.

Disney

Leave it to Disney to lead the pack when it comes to making customers happy. Guests to the Magic Kingdom have...

Careers in Healthcare Technology: Advice from an Expert

For this month’s career feature, we interviewed Oliver Amft, who authored “How Wearable Computing Is Shaping Digital Health” in the January-March 2018 issue of IEEE Pervasive Computing. Amft is the founding director of the Chair of eHealth and mHealth at the Friedrich Alexander University Erlangen-Nuremberg (FAU), where he has been a full professor since 2017. Amft has coordinated European research consortia such as GreenerBuildings and iCareNet, and is a principal investigator for several other European and national projects. He has co-authored more than 150 refereed archival research publications in the fields of context recognition, biomedical sensor technology, wearable computing, digital health, and embedded systems. We asked Amft about careers in healthcare technology.

Computing Now: What types of tech advances in the field of healthcare technology will see the most growth in the next several years?

Amft: I believe there will be two key areas: (1) Novel ...

The Many Roles and Names of the GPU
Its versatility has led it to many and varied platforms

The original use and development of the GPU was to accelerate 3D games and rendering. Accelerating a game’s 3D models involved geometry processing, matrix math, and sorting. Rendering involved polishing pixels and hiding some of them. These were two distinct, non-complementary tasks, but both were served admirably by a high-speed parallel processor configured as a SIMD (single instruction, multiple data) architecture. The processors were used in shading applications and became known as shaders. Those GPUs were applied to graphics add-in boards (AIBs) and served their users very well. SIMD is the architectural design and GPU is the branding name, just as an x86 CPU (brand) is a CISC architecture and an Arm CPU (brand) is a RISC architecture.

It didn’t take long for the mass-produced GPU, which enjoyed the same economy of scale the ubiquitous x86 processor did, to be recognized as a highly cost-effective processor with massive compute density. As such it was a...
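The SIMD idea described here, one instruction applied in lock-step to many data elements, can be sketched in a few lines of Python. This is a conceptual illustration only: the function name `simd_apply` is invented, and a real GPU performs this in parallel hardware lanes rather than a loop.

```python
# Conceptual sketch of SIMD execution: the *same* instruction is
# applied to every data element ("lane"). Names are illustrative
# only, not any real GPU API.

def simd_apply(instruction, lanes):
    """Apply one instruction across all data lanes."""
    return [instruction(value) for value in lanes]

# A shading-style operation: brighten four pixel intensities at once.
pixels = [10, 20, 30, 40]
brightened = simd_apply(lambda p: p * 2, pixels)
print(brightened)  # [20, 40, 60, 80]
```

The point of the sketch is that only one instruction (`lambda p: p * 2`) is issued, yet every data element is transformed, which is exactly the shape of work, per-vertex and per-pixel operations, that made the GPU a natural SIMD machine.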

Five Key Hybrid IT Tips and Putting Hybrid IT to Work for You

The hybrid IT environment is here to stay. But many organizations still haven’t been able to grasp the essential benefits of uniting a mix of workloads that live on premises, in the cloud, on the edge, and/or in co-location.

Whether you’re eager to extend your data center into the cloud for increased capacity and disaster recovery, or you simply want certain applications to reside in the cloud while others remain on-premises for compliance and cost reasons, there are key things you can do to fully leverage hybrid IT. Here are five tips to get the most out of this multi-source environment:

1. Knowing Why is as Important as Knowing How

Just because you can doesn’t mean you should. Before jumping into hybrid IT, do an appraisal of your business goals and decide what you expect your hybrid IT environment to do. Don’t just start selecting a bunch of cloud services and begin using them. “Blueprint” your enterprise’s strategic IT plan for half a decade into t...

Will Education be the Next Big Market for Electronic Displays?

For many of us, going to school meant filling backpacks and school lockers with big piles of books and binders overflowing with notebooks. Even today, most schools rely on traditional textbooks that are frequently outdated by the time they’re researched, written, printed, and circulated. Budget-strained schools are faced with buying expensive, updated books every few years. Besides offering limited, often obsolete information, books usually capture the thoughts of only a single author. While this is not a big issue for subjects like math or chemistry, it is a significant problem when the topic is business or technology. Fortunately, help is at hand.

Satisfying the Need for Diverse, Up-to-Date Content

The move is on to finally give students access to current, diverse content via text, video, and audio. Educators worldwide, especially those in developing countries, are eager for their students to access the vast library systems of the first world, to the classroom cont...

Best 35 Developer Job Posting Sites for Employers in 2018

Developers are in high demand. If you’ve decided to hire top developer talent, you’ll have to fight an immense amount of competition. You have the best chance of hiring the most talented developers when you use trusted websites that save you both time and money.

From JavaScript experts to freelance mobile developers, there is a website out there that caters to employers looking for reliable technical talent. Unfortunately, there are also many popular job posting sites for employers that may end up wasting your time, either because they lack features or fail to attract the best in the business.

Your company needs the best job posting sites for employers that can connect you with the best tech talent. The 35 developer job posting sites listed below are essential for employers looking to hire developers in 2018:

Dice

Dice is a technology-focused job board that has connections to the world’s largest tech firms. Their data analytics software allows em...

Scattered Information? Try These Techniques To Rope In Wandering Data

Manually gathering data from multiple sources takes time, yet it’s a necessary task when data analysis drives your decision-making. With data coming from multiple sources and programs that don’t automatically talk to each other, processing data can be an arduous task.

The promise of big data is insight into customer experiences that can be used to improve products and services. To gain those insights, you need an airtight organizational system for capturing and managing your data. The promise of big data can only be realized when disparate data sources are integrated.

The first step is to de-clutter: stop collecting data you don’t plan to make available to decision-makers. The second step is to make sure the data you collect is accessible. The final step is to implement company-wide policies to maintain the integrity of your data.

Create an organized foundation: be selective with the data you collect

The amount of data you could collect is infinite, but th...
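As a rough sketch of the integration step, here is how records from two hypothetical sources keyed on a shared customer ID might be combined into one view. The source names (`crm_data`, `billing_data`) and fields are invented for illustration, not taken from any particular product.

```python
# Hypothetical example: merge per-customer records from two disparate
# sources into one view keyed on customer ID. All names are invented.

crm_data = {
    "C001": {"name": "Alice", "email": "alice@example.com"},
    "C002": {"name": "Bob", "email": "bob@example.com"},
}
billing_data = {
    "C001": {"last_invoice": 120.50},
    "C003": {"last_invoice": 75.00},
}

def integrate(*sources):
    """Merge records from any number of sources, combining fields per ID."""
    merged = {}
    for source in sources:
        for customer_id, record in source.items():
            merged.setdefault(customer_id, {}).update(record)
    return merged

combined = integrate(crm_data, billing_data)
print(combined["C001"])
# {'name': 'Alice', 'email': 'alice@example.com', 'last_invoice': 120.5}
```

Real integrations add the other two steps around this core: dropping fields nobody uses (de-cluttering) and validating records on the way in (integrity policies).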

GPU History: Hitachi ACRTC HD63484
The second graphics processor

With the advent of large-scale integrated circuits coming into their own in the late 1970s and early 1980s, fueling the PC revolution and several other developments, came a succession of remarkably powerful graphics controllers. NEC introduced the first fully integrated LSI graphics chip in 1982 with the NEC µ7220, and it was wildly successful, finding its way into graphics terminals and workstations, but not into PCs built by IBM. It did, however, get used quite extensively by aftermarket suppliers.

Hitachi did NEC one better and introduced its HD63484 ACRTC (Advanced CRT Controller) chip in 1984. It could support a resolution of up to 4096 × 4096 in a 1-bit mode within 2 Mbytes of display (frame) memory. The ACRTC also proved to be very popular and found a home in dozens of products, from terminals to PC graphics boards. However, these chips, pioneers of commodity graphics controllers, were just 2D drawing engines with some built-in font generation. That same year IBM introduced the...
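The 2 Mbyte frame-memory figure follows directly from the resolution: at 1 bit per pixel, a 4096 × 4096 frame needs 4096 × 4096 ÷ 8 bytes. A quick arithmetic check:

```python
# Frame-buffer size check for the HD63484's 1-bit 4096 x 4096 mode.
width, height, bits_per_pixel = 4096, 4096, 1

total_bytes = width * height * bits_per_pixel // 8
print(total_bytes)                  # 2097152 bytes
print(total_bytes / (1024 * 1024))  # 2.0 Mbytes
```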

How to Make Your PC Run Faster

If you’ve had your computer for more than a year or two, you’ve likely noticed its basic functions slowing down. There are many reasons for this, including the excessive (and increasing) number of files bogging the system down, and bugs in your operating system. Some of these factors can be mitigated or prevented, while others are just a natural part of a computer’s lifecycle.

Fortunately, there are a few important changes you can make to encourage your PC to run faster.

When to Replace Your PC

Note that while the following strategies can be used to make your PC run faster, they can only do so much. If your computer is several years old and has been subject to heavy downloading and installation, even the best strategies may only marginally improve your performance. At that point, it may be time to start shopping for deals on computers, so you can replace your unit entirely.

Strategies for Faster Computing

Try these tactics to make yo...

Famous Graphics Chips: EGA to VGA

The initiation of bit-mapped graphics and the chip clone wars

When IBM introduced the Intel 8088-based Personal Computer (PC) in 1981, it was equipped with an add-in board (AIB) called the Color Graphics Adapter (CGA). The CGA AIB had 16 kilobytes of video memory and could drive either an NTSC TV monitor or a dedicated 4-bit RGB CRT monitor, such as the IBM 5153 color display. It didn’t have a dedicated controller and was assembled from a half dozen LSI chips; the large chip in the center of the board is a CRT timing controller (CRTC), typically a Motorola MC6845.

Figure 1: IBM’s CGA Add-in board (hiteched)

Those AIBs were over 33 cm (13 in) long and 10.7 cm (4.2 in) tall. IBM introduced the second-generation Enhanced Graphics Adapter (EGA) in 1984, which superseded and exceeded the capabilities of the CGA. The EGA was then superseded by the VGA standard in 1987.

Figure 2: IBM’s EGA Add-in board — n...
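The 16 kilobytes of video memory lines up with CGA’s standard graphics modes: both 320 × 200 at 2 bits per pixel and 640 × 200 at 1 bit per pixel need 16,000 bytes, just under the 16,384-byte buffer. A quick check:

```python
# CGA video-memory check: both standard graphics modes fit in 16 KB.
def frame_bytes(width, height, bits_per_pixel):
    """Bytes needed for one frame at the given resolution and depth."""
    return width * height * bits_per_pixel // 8

buffer_bytes = 16 * 1024  # the CGA board's 16 KB of video memory
for mode in [(320, 200, 2), (640, 200, 1)]:
    needed = frame_bytes(*mode)
    print(mode, needed, needed <= buffer_bytes)  # 16000 bytes each; fits
```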

7 Tips for Faster 3D Rendering

3D rendering is a miracle of modern technology, capable of everything from creating lavish gaming experiences to simulating real-world environments for businesses. Unfortunately, your setup might suffer from lag, or delays that make it aggravating to render anything—but there are some simple changes that can improve your performance.

Why Is 3D Rendering So Resource-Intensive?

Regardless of your application, 3D rendering is incredibly resource-intensive. This is partially because 3D rendering demands multiple components operating in unison, including your graphics card, your RAM, your hard drive, and of course, the software you’re using. If even one of these components is off, your rendering speed could be negatively affected.

The problem is complicated by the fact that 3D rendering contains so much depth. Intuitively, you know that 3D rendering is more resource-intensive than 2D rendering because it multiplies your graphical needs by anoth...

Excessive Screen Time: Not a Problem Specific to Kids

Everywhere you look there are kids – from toddlers to teenagers – with noses buried in screens. Whether it’s a computer, tablet, smartphone, or TV, our youth are spending way too much time sitting in front of screens. But this isn’t a problem reserved for kids.

The Universal Problem of Excessive Screen Time

Whether it’s a tablet, smartphone, or video game console, even today’s youngest children find themselves constantly connected – glued to screens and addicted to the release they provide.

According to a research study that examined nearly 900 children between the ages of six months and two years, children who use handheld screens before they begin to talk may be at a higher risk for speech delays.

“By their 18-month check-ups, 20 percent of the children had daily average handheld device use of 28 minutes, as reported by their parents,” ASHA explains. “Using a screening tool for language delay, researchers found that the more handhe...