The highs and lows of pop culture in 2017

With the new year upon us, it is almost compulsory to reflect on what has happened in the past year. Where culture in history was often a solemn affair, no other generation has seen as much humour wrung from everything from world events to commercials. The internet has liberated the world and given us constant communication with the world around us by tearing down walls of distance and time zones, and it has also allowed culture to be shared and absorbed, creating a more united and understanding planet through jokes and sarcasm – the pinnacle of human evolution.

The year started with no artist willing to headline the inauguration of the current president of America and former reality TV star, Donald Trump, best known for The Apprentice. During the Golden Globes, Meryl Streep took the opportunity to call out the then president-elect for his blatant disrespect towards Serge Kovaleski, a disabled reporter. But that was not all – a month later, as the Academy Awards drew to a close, the announcement for Best Picture was mixed up between La La Land and Moonlight, reminiscent of the Miss Universe mix-up back in 2015.

Later in May, President Trump managed to unite the nation with a misspelling of the word ‘coverage’ posted to his Twitter account: covfefe. Around the same time, a teenager angling for a year’s supply of chicken nuggets from Wendy’s snagged the most retweeted tweet of all time on the same platform – a title previously held by Ellen DeGeneres – with a little celebrity help and a push from giant companies like Google and Amazon. The publicity surrounding his tweet also earned a $100,000 donation to the Dave Thomas Foundation for Adoption.

However, it was not all fun and satire in 2017, given the acts of terror that plagued the world, even at concerts. One notable bombing took place at the end of an Ariana Grande concert in Manchester. 22 people were killed in the blast, with over a hundred injured. Undaunted, the songstress hosted a benefit concert just two weeks later in the same city. Joined by fifty thousand people along with other big names like the Black Eyed Peas, Miley Cyrus and Justin Bieber, they raised a total of $2.5 million.

The power of celebrity reignited a movement for victims of sexual harassment to step forward, beginning with Taylor Swift’s groping incident four years earlier, which had been kept from public knowledge because her mother felt there was no need to put the artist in a vulnerable position. Following Swift’s complaint, the radio host who harassed her was fired and, unable to find work, decided to sue her for defamation and $3 million in damages.

She countersued for a single dollar, intending to serve as “an example to other women who may resist publicly reliving similar outrageous and humiliating acts” and to show that she was not looking for financial gain from her accusations. Her win landed her on the cover of Time magazine as one of the Silence Breakers.

Taylor Swift also addressed the criticism she has had to endure by making a comeback and dropping the fastest-selling album of 2017. Its lead single, Look What You Made Me Do, referenced her feuds with Kanye West and Katy Perry and her much-publicised, much-ridiculed romance with Tom Hiddleston, among other things. The music video has been hailed as a masterpiece, from its clever mise en scène to the costumes, effectively taking back her identity through imagery.

Many fans also speculated that she placed her entire discography on Spotify – despite her strong opinions on the rights of artists and her insistence that their work should never be distributed for free – in an attempt to undermine Katy Perry’s album, which was due out the same day Swift’s songs returned to the streaming platform.

2017 also saw a surge of interest in bronze sculptures – two in particular. The first, fashioned after soccer star Cristiano Ronaldo, was sculpted by Emanuel Santos and put on display at the renaming of a Portuguese airport. It went viral, generating thousands upon thousands of reactions from netizens all around the world: some called it a work of art, while others made a meme of it. Defending himself, the sculptor told The Guardian that he had the soccer player’s seal of approval for the bust, which will be on permanent display outside the terminal entrance. He even went on to say that “it is impossible to please the Greeks and Trojans. Neither did Jesus please everyone”.

The second bronze sculpture to catch the world’s attention was created by Jose Antonio Navarro Arteaga, and was revealed last month at Santiago Bernabéu Stadium in Madrid, Spain. Fans quickly gave their approval and were satisfied that it captured the face and essence of their beloved star.

In that same vein, following the lukewarm reception of Batman v Superman and other movies from the DC universe, Wonder Woman has surpassed all expectations. Many are calling it an advocate for women’s rights and empowering, as no film of its budget and scale had ever before been helmed by a woman director.

Here’s to another year filled with great mistakes, memes and memorable moments.

Watch Your Eyes

It’s an undeniable fact that computers, whether in the form of a mobile phone or a personal laptop, are an essential engine of our everyday commercial and personal lives. While the spiritually inclined may argue the benefits of keeping computers at arm’s length, it only takes one look at the plethora of transactions conducted every second of every day, whether online or in-store, to realise that the use of computers is essential. How else would the banking and financial services sector operate? On accounting slips of paper? The tide of the 21st century has shaped the daily habits of the average worker, and an increasing number of jobs require at least some use of a computer screen, whether you are a cashier, a receptionist or an accountant.

But what is the effect on our eyes of the increasing number of hours we are spending on computers? Studies show that it’s a serious and growing health problem. According to The Vision Council, a trade association for the optical industry, up to 95 percent of Americans spend two or more hours each day using a digital device. That’s a gargantuan proportion of America’s 323 million people exposed to the risk of digital eye strain. Researchers even have an umbrella term for it: CVS, or Computer Vision Syndrome. Further studies have shown that 50% to 90% of people who work at a computer screen have some symptom of eye trouble.

What are these symptoms? The average person blinks around 18 times per minute, a process which naturally refreshes the eyes by spreading a fresh film of moisture across their surface. But remember that last Excel spreadsheet you were typing out for your boss? Or that marketing PowerPoint you were finishing in the early hours of the morning? Blink rates drop when we stare at a computer screen for too long, which inflames our eyes and causes them to burn, dry out, redden or become irritated.

If you’ve ever rubbed your eyes to get rid of the grittiness, know that it’s a band-aid solution. There is a deeper underlying problem: the structure of your eyes is slowly being changed. When we read a screen up close, two harmful effects emerge. Firstly, our peripheral visual field is out of focus, and the eyeball tends to grow to compensate, contributing to short-sightedness. Secondly, the constant contraction of our ocular muscles when focussed on a nearby screen makes the eyeball more elongated. Minuscule muscle structures within the eye reshape the lens in order to bring parts of what you’re reading into focus. Hours of focus on a single point, which often happens unconsciously, will fatigue these muscles to the point that the eyes can no longer focus.

Risk of eye strain is generational: the younger a person is, the heavier their reliance on technology tends to be. Think of the last time your hand absent-mindedly swept to your pocket to check a friend’s latest Instagram post or LinkedIn update. Further, companies are offering unlimited entertainment options to stream favourite movies or TV shows without touching the traditionally billed Internet data on customers’ phone plans. All of this adds to the risk of the longer-term damage caused by Computer Vision Syndrome. But fun aside, computer use is essential to the commercial functioning of society, so how can we combat the increasing strain we are putting on our eyes?

Let’s start with the 20/20/20 rule, a simple method to mitigate eye discomfort: after every 20 minutes of computer screen use, take a 20-second break and stare at something at least 20 feet away, giving your eyes’ lenses a rest. The size of your computer font matters too: the smaller it is, the more you strain your eyes to decipher it. Enlarging the font comically, to the size it might appear on laptops for seniors, is rarely practical, as productivity often demands fitting more words on the screen, or multiple screens, at any point in the working day. But an increase in computer screen size would be beneficial, as it provides a larger surface area of focus and reduces how tightly the lens must contract.
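For readers who live at a keyboard anyway, the rule above is easy to automate. The sketch below is a minimal, hypothetical reminder loop – the function name and messages are invented for illustration, and a real desktop notifier would replace the `print` call:

```python
import time

def twenty_twenty_twenty(work_seconds=20 * 60, break_seconds=20, cycles=3):
    """Prompt a short gaze break after each stretch of screen time."""
    reminders = []
    for cycle in range(1, cycles + 1):
        time.sleep(work_seconds)       # one 20-minute stretch of work
        message = (f"Break {cycle}: look at something at least "
                   f"20 feet away for {break_seconds} seconds.")
        print(message)                 # swap in a desktop notification here
        reminders.append(message)
        time.sleep(break_seconds)      # the rest itself
    return reminders
```

Run it in a spare terminal during the workday; shrinking `work_seconds` to a few seconds is a quick way to try it out.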

Essentially it’s a combination of display settings that will require tinkering over time to suit your eyes. Brightness should be approximately the same as that of your surrounding workstation. As an experiment, look at the plain white background of this webpage: if it is bright enough to be a source of light in your current environment, it is too bright. Colour temperature is a further crucial factor – a technical term describing the spectrum of light emitted by your computer screen. You may have heard of blue light filters in the context of mobile phone features and eyeglass offerings. These are beneficial in screening out short-wavelength visible light – blue light – which is more harmful than light of longer wavelengths, the hues of orange and red. Reducing the colour temperature reduces the amount of blue light emitted, and blue light filters on your mobile phone or eyeglasses reduce the amount you absorb.


Big Data, Health and Patient Privacy

The new wave of digitizing medical records has brought a paradigm shift to the healthcare industry. The ability to store patient records and research means the industry is witnessing a huge increase in the volume of data. As healthcare providers and insurance companies look for every feasible way to lower costs and improve the delivery of care, the promise of big data is set to transform the industry from reactive to proactive. But there are still some concerns: cybersecurity issues and patient privacy are complications the health industry must solve if it wants to take advantage of the benefits big data can provide.

Healthcare data, especially on the clinical side, has a long shelf life. Providers are legally required to keep patient data accessible for at least six years. But they may also wish to make use of de-identified datasets for research purposes. Data may also be reused or re-examined for quality measurement or performance benchmarking.

Despite the advantages big data brings to the healthcare industry, unauthorized disclosure of patients’ private health information remains a serious issue. Data breaches cost the industry somewhere around six billion dollars annually, and some statistics pin the number of data breaches per year at close to 50,000. A survey from the Ponemon Institute found that ninety-seven percent of healthcare organizations reported suffering a data breach within the previous two years. Causes for breaches range from lost and stolen equipment to employee mistakes. But experts say it’s a trend that will continue as big data and the healthcare industry become more intertwined.

It’s understandable that data security has become a high-level priority for healthcare organizations, and with the number of hacks and breaches we’ve seen in 2017, it’s obvious that healthcare data is subject to an enormous range of vulnerabilities.

Doctors and clinic staff rarely think about where the data they access is stored. As the volume of healthcare data grows exponentially, some providers find themselves unable to manage the costs and impact of on-premises data centres. Outsourcing services like medical billing to a dedicated company with its own secure servers, and a security and risk team to manage them, is one way to give patients peace of mind that their personal data is safe and secure.

Breaches in data security aren’t the only concern, however.

The practice of sharing patients’ ‘encrypted’ data for the purposes of research and clinical trials is not without its dangers.

2.9 million Australians recently had their private ‘encrypted’ health data pulled from the web after a report from the University of Melbourne showed that patients could be re-identified through a process as simple as linking the ‘encrypted’ parts of the record with known information about the individual, such as medical procedures they’ve undergone and year of birth. The researchers identified several high-profile Australians this way, including three Members of Parliament and a prominent Australian footballer. According to the report, this highlights the privacy risk inherent in sharing patient data, and illustrates the ways de-identification practices can fall short.
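The linking the researchers describe is worryingly simple. The toy sketch below uses invented records – names, years and procedures are all hypothetical – but the mechanics match the attack: a release that drops names yet keeps year of birth and procedures can be joined against facts an attacker already knows about a public figure.

```python
# Invented toy data: the "de-identified" release drops names but keeps
# year of birth and billed procedures - fields often reported publicly
# for politicians and athletes.
released = [
    {"record": "rec-001", "birth_year": 1962,
     "procedures": {"knee reconstruction", "angiogram"}},
    {"record": "rec-002", "birth_year": 1962,
     "procedures": {"appendectomy"}},
    {"record": "rec-003", "birth_year": 1985,
     "procedures": {"knee reconstruction"}},
]

def reidentify(records, birth_year, known_procedures):
    """Return every record consistent with what the attacker already knows."""
    return [r for r in records
            if r["birth_year"] == birth_year
            and known_procedures <= r["procedures"]]  # subset test

# A birth year plus one well-publicised operation singles out one record.
matches = reidentify(released, 1962, {"knee reconstruction"})
```

With only two facts, a single record survives the filter – and every unreleased field in that record (other procedures, dates, locations) is exposed with it.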

However it reaches their hands, once cybercriminals have access to patient data it can be used with malicious intent – and when we’re talking about data that doesn’t expire, such as social security numbers or dates of birth, it can be used repeatedly.

Stolen medical records are generally resold on the dark web, either as complete records or in piecemeal portions. They sell for several times what a stolen credit card number or social security number does, because cybercriminals can pick and choose what they want from the various parts of a patient’s record: medical histories, test results, methods of payment, home addresses, credit card numbers and birthdates, to name a few.

Thieves can then use the information from stolen electronic records to receive medical care, file fake insurance claims or gain access to prescription drugs. They can exploit identifiable data to obtain credit or take out loans, or even forge official government-issued documents like passports.

The growing Internet of Things and the prevalence of tracking apps and wearable devices also pose security risks. Privacy safeguards for the consumer are basically non-existent, with end-users themselves having little control over the use and distribution of the data these devices gather. And such devices are only going to become more widespread.

Blockchain has been heralded as one of the potential saviours of security in the healthcare industry, with some experts suggesting it has the potential both to keep health data private and secure and to reap the benefits of wearables and other connected medical devices. Because it has no centralized point for hackers to attack, it provides an additional layer of security against the threat of a breach.

But the truth is that big data for healthcare is largely uncharted territory, and the health sector has lagged behind other industries when it comes to making cybersecurity a priority. As big data continues to shape the way health care is provided, the industry will need to adopt strategies and practices that keep patients’ data private.

Can Science Create the Perfect Baby?

Imagine a world without any life-threatening diseases, where everyone has perfect health. A world of comfort, where everyone is beautiful and whole. It is a world that geneticists are working towards: instead of finding cures, they aim to nip diseases in the bud before birth.

Researchers have recently succeeded in repairing a mutation that causes hypertrophic cardiomyopathy, a heart condition that can prove fatal and is the leading cause of sudden death in young athletes, whose hearts give out. Using CRISPR, a genetic engineering tool, they were able to correct the gene so the condition would not manifest.

Along with its success have come worries about designer babies. However, the researchers behind the project say those worries are unfounded: it is not yet possible to enhance babies, only to treat an underlying problem. The team elaborates by explaining how the mutated gene was repaired: the DNA inserted to rewrite the mutation was not incorporated by the embryos; rather, it was the mother’s DNA that was used to repair the MYBPC3 mutation carried by the father’s sperm.

Robin Lovell-Badge, a biologist at the Francis Crick Institute, said as much, implying that it is not possible to add something to an embryo that was not already there. This means parents may not be able to ensure that their offspring will be Ivy League material or inherit talent and supermodel looks, but it is a step towards eliminating health issues their child might otherwise have.

While the idea of gene therapy is not new, the technology behind CRISPR allows for better control and is more effective at editing genomes. But it is not without its consequences: everything that is fiddled with has the potential to affect something else, which is why the embryos used in the experiment will not be implanted, for fear of negative side effects such as changes to other genes.

In a study published in Scientific Reports, Dr Alan Sanders found that there might be a correlation between homosexuality and genetics, by studying men who identified as gay or straight. He and his team discovered two genes that differed between the two groups, identified as SLITRK5 and SLITRK6. This breakthrough could allow parents to test for the sexuality of their unborn child. However, Professor Robin Lovell-Badge was quoted as saying: “Even if a gene variant does show some correlation with sexual orientation, this does not mean that the gene is in any way responsible for being gay.”

While this remains unconfirmed territory, another practice made waves back in 2016, when a baby was conceived with DNA from three individuals to correct the mother’s faulty mitochondria, which would otherwise lead to Leigh syndrome, a neurological disorder that eventually ends in respiratory failure.

A fertility clinic based in New York carried out the mitochondrial-replacement procedure in Mexico, where the laws surrounding genetic modification were more lax than in America at the time. The team replaced the nucleus of a donor’s egg with the mother’s – the nucleus carries the bulk of a person’s DNA – and fertilized it with the father’s sperm. The baby was born with nuclear DNA from his parents but mitochondrial DNA from the donor. The procedure had not been thoroughly tested at the time, but having lost two children already, the parents decided to go ahead regardless.

The UK has since legalized the procedure, with other countries weighing similar moves. In America, the one requirement proposed is that implanted embryos be male, so as to minimize the risk of passing faulty mitochondrial DNA along to future generations.

It might seem like the beginning of a utopia of perfectly healthy beings, but many disorders still require extensive study, such as autism. No single gene is responsible for causing it; underlying variants have been found across more than a thousand genes. The same is true of other conditions, such as certain cardiovascular diseases, cancer, Alzheimer’s and mental illness.

This is why it is currently impossible to create the perfect designer baby, with a desirable IQ, a chosen height or skin colour, or the ability to run fast, jump high or lift heavy loads even without equipment like sprinter shoes or weightlifting gloves. These are all polygenic traits, meaning it takes more than one gene to shape any one of them.

Genetics has come a long way since the mid-19th century, when Gregor Mendel, a monk and biologist, experimented with the colours of pea flowers. His discoveries paved the way for James Watson and Francis Crick to uncover the double helix of human DNA, and we have since modified crops and cattle to meet the demands of a growing population. Today, we are making steady headway towards perfecting humankind.

Selling Luxury

The Oxford English Dictionary defines luxury as ‘a state of great comfort or elegance, especially when involving great expense’. For most of the world, that definition is satisfactory. However, in the peculiar world of retail, luxury takes on an entirely new meaning. Stanley Marcus, one of the pathbreakers at luxury retailer Neiman Marcus, had a more elegant definition: “Luxury is the best that the mind of man can imagine and the hand of man can create.” This definition fits perfectly with the idea that sits at the heart of all retail focused on selling luxury items.

Since time immemorial, those with wealth have splurged disproportionate resources on things they did not necessarily need, yet the desire to have them has remained undiminished, if not elevated. On some occasions, these extravagant purchases were driven by the additional comfort such items provide: a more comfortable car, a larger house, even a warmer bed. However, such motives account for a relatively small share of the intentions that drive most luxury purchases.

Over the years, luxury goods have become a representation of wealth rather than commodities in themselves. Their value derives not from the functional purpose they serve but from their ability to signal a person’s affluence to others. The value of being seen arriving at an event in a top-of-the-line automobile often outweighs the value of the sum of its features. In the past, these status symbols might have been diamonds and other precious stones; today they range from fashion accessories to cars, and sometimes even tech products.

With the high price these goods command, competition for the customers buying them has exploded over time. The ability to sell to these highly demanding customers comes at a cost: they expect nothing but the best. Whether it is a beautifully designed watch or an expansive yacht, the resources that go into earning a customer’s trust are considerable.

The organizations that have seen success have generally been the ones that found a way to deliver a consistent, high-quality product over long periods of time. Sitting atop the luxury market in most segments, these heritage brands have become global powerhouses. While they do have incredibly large marketing budgets at their disposal, their value comes not from their media spend but from the story they have crafted around the brand – a story that started in the past and continues to evolve, appealing to each new generation of big-spending customers. Their branding efforts have become shining examples for marketers all over the world.

While the digital revolution might have lowered the barrier to entry in this segment of retail, there is no doubting that the large established brands will continue to dominate. Still, the key to success in the field remains the same as it has been since the first premium products were created: passion for the craft that goes into creating them.

Today, it is important to note the impact the internet has had on the industry. While there is no doubting that the market will continue to thrive, the rules that govern it may be changing. The experience created around owning most luxury products has been a big part of their appeal. Buying a dress meant going to the store, perhaps sipping champagne while multiple dresses were brought out to try. Buying a car meant test driving multiple options and spending the day deciding which felt best. There is a constant struggle to replicate these experiences in an increasingly digitized environment.

This is not the only struggle luxury goods face. For as long as they have existed, an old foe has accompanied them into new markets, eating away at their margins: the counterfeit marketplace. The perception these brands have built for themselves has left them falling victim to their own success. Imitation goods mimicking their image have flooded the markets, thriving on a customer base that aspires to everything the brand stands for but remains unable or unwilling to spend the money its price tags dictate.

Certain experts in the field claim that having your brand replicated is an indication of success – for a luxury brand, a marker that it has truly arrived. Still, the regulatory framework today is often helpless when it comes to curbing this illegal operation, which consistently churns out billions of dollars in business.

At the end of the day, people will always be willing to pay a premium for the quality and perception luxury goods offer. Whether they are fixtures in their homes, the clothes they are seen in, or toys that are just for pleasure, people with wealth will continue to accumulate luxury items to enrich their lives. The question, as always, will remain: who will win the battle to sell these items to them?

What Your Feet Are Trying to Tell You

Feet are one of the most important, and most often neglected, parts of our bodies.

Our feet are designed to withstand the entire weight of our bodies as we navigate unsteady terrain, to help propel us forward as we walk, run and jump, and to keep us balanced while we wait in line. In fact, the condition of your feet can offer clues to a host of medical issues, such as diabetes, arthritis and even heart disease.

Every day, as we take thousands upon thousands of steps, it’s only natural that our feet suffer wear and tear. It’s estimated that around 75% of us will experience foot pain, and according to experts, most people have some level of misalignment in their feet.

Randomly occurring cramps are one of the most common foot problems, but regular cramping can be an indication of major problems with circulation or nerves. Cramps can also be related to dehydration and nutritional deficiencies.

Most people experience swelling in their feet at some point in their lives, but there are times when swollen feet can indicate a more serious concern. The most common causes of swelling are usually simple: restrictive footwear, a long flight, or a minor ankle sprain. However, if the swelling lasts for more than a couple of days, it might be a medical condition that bears investigating. Swelling in the feet can be a result of congestive heart failure or kidney disease, or even a side effect of prescription medications. It can also result from inflammation stemming from osteoarthritis or rheumatoid arthritis, so it’s important to seek a professional opinion if the swelling persists.

Though most of us believe bunions are the result of pointy-toed shoes and high fashion, they are actually a sign of a flawed foot structure. In many cases the flaw is inherited and merely aggravated by inappropriate shoes. In cases where the movement of the bone is quite severe, the only corrective option is surgery.

Plagued by neck and back pain? The culprit could be your feet! Structural or functional imbalances in the foot can put stress on the hips, back and neck. Our bodies are connected from head to toe, which is why an affliction in one area can easily affect a seemingly unrelated part of the body. If you have a pain in your foot, you will subconsciously change the way you walk to avoid causing further pain. Just think about the last time you stubbed your toe or got a shard of glass stuck in the ball of your foot: you change your gait to avoid causing any additional damage. Usually this is temporary, but in some cases, where someone has an untreated injury or deformity in their foot, the gait adjustment can become permanent and start causing problems in other joints.

Some groups are more at risk of foot-related pain and injury. Diabetes sufferers face a greater risk of developing foot problems thanks to the damage diabetes can cause to nerves and blood supply. Keeping active and exercising regularly is vital for people with diabetes, and walking has been shown to help maintain healthy blood sugar levels. But appropriate footwear matters too: diabetic or therapeutic shoes are specially designed to help prevent some of the common foot problems that plague diabetics, whether for exercise or day-to-day activity.

If you’ve grown accustomed to your feet feeling like ice blocks, there’s a chance that your cold toes are a sign of a bigger problem. Peripheral arterial disease (PAD) and other forms of heart disease can cause your arteries to narrow, impeding the flow of blood throughout the body. Extremities like our hands and feet are particularly susceptible to poor circulation, so if you notice your feet are always cold or feeling numb, consult your doctor to make sure there isn’t a more serious underlying problem.

Most people take their feet for granted until pain or problems such as blisters or calluses develop. It’s important to be kind to your feet and take care of them: wash them daily with warm, soapy water, remove dead, dry skin with a pumice stone, and dry your feet thoroughly. A daily routine like this can help prevent common problems like calluses, ingrown toenails, corns and athlete’s foot! Vary your footwear, make sure your shoes fit properly, and ladies, avoid heels above the 2-inch mark for daily use – regularly wearing very high heels can shorten the Achilles tendon and cause pain and gait issues later in life. Be sure you’re wearing the appropriate footwear for whatever activity you’re doing, and be aware of unexplained pains in the hips, knees or back. The best defence is to treat existing problems as soon as they arise.

What is the Future of Computing?

Computers have become almost synonymous with the modern age. They started out as large mainframes, morphed into desktops, transformed into laptops, and metamorphosed into the tablets and smartphones we hold near our faces. The coming era will be a far cry from anything witnessed so far: computers will cease to be devices that are turned on and off at will and become inalienable components of our lives.

Moore’s Universe

Moore’s Law states that the number of transistors on a microprocessor, and with it the available computing power, doubles roughly every two years. The law has held true since Gordon Moore articulated it in 1965, but it may not hold much longer. Its imminent end was predicted as far back as 2000, when the MIT Technology Review warned that silicon technology cannot get smaller and faster beyond a point. One may add a note of caution that Moore’s Law isn’t really a law in the strict sense of the term, as it does not describe an immutable truth such as gravity.
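The arithmetic behind the law is plain exponential growth, and a quick sketch makes the scale of it tangible. The projection below simply applies the two-year doubling cadence described above; the starting point, Intel’s 4004 with 2,300 transistors in 1971, is a historical anchor rather than anything the law itself specifies:

```python
def moores_law(initial_transistors, years, doubling_period=2):
    """Project a transistor count assuming a doubling every two years."""
    return initial_transistors * 2 ** (years / doubling_period)

# Intel's 4004 shipped with 2,300 transistors in 1971; twenty doublings
# (40 years at the two-year cadence) project roughly 2.4 billion,
# which is the right order of magnitude for chips of the early 2010s.
projected = moores_law(2_300, years=40)
```

Twenty doublings turn thousands into billions, which is exactly why even a modest stretching of the doubling period marks such a dramatic slowdown.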

Post-Moore Future

Silicon technology may have reached a saturation point, but new chip materials and innovative ways of defining computing itself will continue to drive exponential growth of computing performance in the future.

Quantum computers

Quantum computers will leapfrog into the future because they are based on quantum bits, or qubits, which can be a zero, a one, or a superposition of both at the same time, unlike even the most sophisticated contemporary computers, whose bits hold only a one or a zero. Quantum computers will thus be able to solve certain complex problems millions of times faster than today’s conventional computers.
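To make the idea of superposition concrete, here is a toy numerical sketch; it is a single-qubit model for illustration only, not how a real quantum computer is programmed. A qubit’s state is a pair of amplitudes, and measurement probabilities are the squared magnitudes of those amplitudes.

```python
import math

# Toy single-qubit model: state = (amplitude of |0>, amplitude of |1>).
# Measurement probabilities are the squared magnitudes of the amplitudes.

def probabilities(alpha, beta):
    return abs(alpha) ** 2, abs(beta) ** 2

# A classical bit is one of two definite states:
zero = (1, 0)   # always measures 0
one = (0, 1)    # always measures 1

# A qubit can also sit in an equal superposition of both:
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))

p0, p1 = probabilities(*plus)
print(p0, p1)  # ~0.5 each: a 50/50 chance of measuring 0 or 1
```

The power of quantum computing comes from the fact that n qubits hold 2^n such amplitudes at once, which classical machines must represent one at a time.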

Biocomputer

Nanotechnology will facilitate the birth of a hybrid computer or a biocomputer that will perform parallel calculations by transferring multiple protein filaments along nanoscopic artificial pathways at the same time.

Neuromorphic computing

Neuromorphic technology is aimed at creating computers that resemble the human brain in processing and learning from data. The ongoing development of chips that train and execute neural networks for deep learning is a step in that direction.

In-memory computing

Slow storage has long been the elephant in the room: the tardy pace at which data is drawn from disk into Random Access Memory (RAM), and the resultant wastage of processor power, has bogged down processing speed in the past. In-memory computing speeds things up by keeping massive amounts of data resident in RAM, so the processor never has to wait on storage.
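The core idea can be sketched in a few lines. This is a toy illustration with a made-up dataset, not a production in-memory engine: load the data into RAM once, then serve every query from memory instead of going back to slow storage.

```python
import csv
import io

# Toy illustration of in-memory computing: instead of re-reading slow
# storage for every query, load the whole dataset into a RAM-resident
# dict once and answer all lookups from memory. The dataset is hypothetical.

RAW = "id,name\n1,Ada\n2,Grace\n3,Alan\n"  # stands in for a file on disk

def load_into_memory(raw):
    reader = csv.DictReader(io.StringIO(raw))
    return {int(row["id"]): row["name"] for row in reader}

table = load_into_memory(RAW)   # one slow read, then...
print(table[2])                  # ...every lookup is a fast RAM access
```

Real in-memory platforms apply the same principle at terabyte scale, which is what makes interactive analytics over huge datasets feasible.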

Brain chips

Brain chips are gaining a lot of attention, thanks to the involvement of technology pioneers such as Elon Musk and Mark Zuckerberg in their development. Dave Limp, the head of devices at Amazon, suggests that the next phase in computing will be more about how and where it is accessed and less about the physical device. And as brain chips sit in the brain itself, access cannot get any more convenient or fast.

With the expected speed of computer processing bordering on the insane, could bandwidth remain a laggard for long? Nielsen’s law observes that a high-end user’s bandwidth grows about 50 percent a year, doubling roughly every 21 months, a pace slower than the growth in processing power described by Moore’s law. But with 4G already the prevailing norm and 5G expected to roll out around 2020, the superhighway of the internet will only get better in the future.

No-touch interfaces will transform human-computer interaction. Gen X was comfortable with keyboards and mice, and the millennial generation is adept at texting at blazing speed. But there is a paradigm shift on the horizon: no-touch interfaces such as Microsoft’s Kinect, Apple’s Siri and Google’s Project Glass will compel computers to adapt to human beings rather than the other way around.

Websites will be more dynamic and provide an enhanced experience to users, thanks to JavaScript frameworks such as AngularJS. Developers and designers trained in AngularJS will work hand-in-hand, with designers building pages in pure HTML and CSS and developers adding the page functionality.

The Internet of Things (IoT) will be the computer technology to look out for. Just as the internet of people connected people in unfathomable ways, the internet of things will create an ecosystem of connected devices that can be accessed through the internet. IoT devices are already making their presence felt. According to research and advisory firm Gartner, about 4.9 billion things were connected to the internet in 2015, a figure that grew past six billion within a year, with 21 billion devices estimated to be connected by 2020.

The smart home is probably the most popular IoT application at this juncture, thanks to the affordability and easy availability of smart products for household use. Users can give voice commands to hands-free Amazon Echo speakers and control their home’s temperature with a Nest Thermostat, making everyday life more connected than ever.

Watches are no longer mere time-keepers. The Apple Watch and other smartwatches have enabled text messaging and phone calls from the wrist, while devices such as Fitbit and Jawbone have revolutionized the fitness world by giving people more data about their workouts. Cars are being equipped with internet access and the ability to share that access with others, just like connecting to a wireless network. The cities of the future will be smart cities, with the Internet of Things helping to resolve traffic congestion and reduce pollution.

Pervasive computing will be the technological norm in the future as computer elements will be incorporated into everything from buildings and highways to vehicles and clothing. Computers are poised to seamlessly blur the divide between digital life and real life. They will be as ubiquitous as the air that we breathe and yet so interwoven into our lives as to seem almost absent.

Education in Niger: Is It Beyond Saving?

Despite technological boons granting access to education worldwide, many African countries still suffer from a shortage of education funding and general government support. The UN Human Development Report for 2013 revealed that the bottom ten countries in education are all situated in Sub-Saharan Africa. Ranked last at 187th worldwide among countries offering data on education, Niger records a mean of just 1.5 years of schooling across a population of roughly 21 million. With 31% of students dropping out in primary school and merely 5.2% continuing beyond it, the country certainly earns its place as one of the least educated nations on the planet.

So, in order to entertain a solution, what exactly are some of the major contributors to these statistics? Why does the Niger government struggle to fund education? As one would expect, economic issues are a large contributor; although Niger saw economic expansion of 6% between 2012 and 2017 due to mineral exports, one source cites a “lack of economic dynamism” as the greatest hurdle to sustaining that growth. The clearest way I can see of ‘kick-starting’ such dynamism would be the establishment of foreign business. This might seem absurd given the treacherous political landscape and Boko Haram presence. However, the country has considerable under-explored mineral wealth, including some of the world’s most significant uranium deposits alongside resources like gold, coal and crude oil. These sites aren’t freely available for ‘looting’ either; they are protected under the national mining code, so it’s certainly possible for a particularly adventurous enterprise to enter the market legally, but whether that would even generate jobs, let alone stimulate the domestic economy sufficiently, is up for debate. It’s likely such companies would fly skilled labour in rather than employ locals, because those locals lack anywhere near the level of education required, a cruel irony that’s hard to overlook.

If affirmative action were implemented demanding a quota of locals be trained in service of the company, those companies would soon face a problem foreign to the developed world: most of the population is illiterate, with merely 27.3% of the male population and 11% of the female population knowing how to read. These companies would therefore need to teach basic literacy before any training in mining, or anything else for that matter, could commence. Such an expensive undertaking, on top of already formidable risks, makes this entire notion even less realistic. So, if we assume this issue must be solved domestically without the assistance of foreign for-profit business, what is available to citizens of Niger in their pursuit of education? Although there are excellent institutions well versed in international student recruitment that could assist citizens in spite of their educationally defunct nation, even they require some baseline of literacy.

Despite the UNESCO benchmark for education spending (developed countries devote around 26% of public expenditure to education), Niger spent less than 9% as of June 2016. This is particularly concerning given the government’s purported commitment to education. Strangely enough, though, the lack of government education expenditure may not spell the end, on account of significant foreign aid. Although outside assistance of this kind is hardly sustainable, it is certainly better than no initiative whatsoever. Since early 2016, USAID has worked with the Millennium Challenge Corporation to close the educational gap, working directly with the Ministry of Education in Niger to base policy decisions on empirical evidence. The US government signed on in 2013 to help bolster the education sector, offering $10 million each year to this end, with France, Belgium, Germany, Switzerland, Canada and Saudi Arabia all making major contributions.

However, despite all this aid, simply throwing money at a problem is insufficient for progress. It would require exceptional logistical acumen to execute and enforce any education regulations or policies adopted under the watchful eyes of the global community. More to the point, Niger suffers from “systemic levels of corruption permeating all levels of society”, according to transparency.org. Mining legislation, like many other laws, is hard to enforce: companies are simply not transparent and are enormously difficult to audit. Knowing precisely where all this aid funding goes is therefore never entirely certain. The police are under-funded and lack training, so the enforcement of any regulation is equally susceptible to being circumvented by bribery.

So, what on earth would it take to salvage education across Niger? We have established its lack of economic dynamism and the obstacles to gaining that quality. We know it suffers from terrorist attacks, mass illiteracy blocking further baseline education, corruption, and the potential for swindled aid funds. It is certainly a precarious situation, but not one devoid of all hope. In my opinion, current UN, European Union and US funding ought to go straight toward educational facilities established and run by contributing nations, with NATO peacekeepers assigned to these institutions in the event of attempted Boko Haram kidnappings, a junta or other insurgent interference. Food and clean water can be provided as an incentive for parents to send their children, a system currently used by the Indian government to remove the burden on parents of feeding their children. Graduates from such schools can then teach English to younger students throughout their secondary education, giving these alumni an employment incentive to keep studying while simultaneously supporting their households. Granted, such a system is fraught with logistical issues like sourcing food, but it removes the corrupt element of going through Niger’s institutions. Nor is it a system to cure all the issues of the country. The goal is simply to foster higher literacy rates among the young, for it is from this foundation that higher education can spring. I would therefore consider the education system of Niger to be within reach of saving, but that may change without swift action.

Are cyberattacks growing more and more frequent?

Earlier this year, a team of hackers stole malicious software from the National Security Agency of the United States and executed damaging cyberattacks that affected dozens of countries around the world. As a result, Britain’s hospitals and public health services were forced to send patients away, computers at Russia’s Interior Ministry were frozen and more than 200,000 individual computers were compromised in over 150 countries. In Asia, students lost access to their theses and graduation papers and in China, hundreds of computers used to mine Bitcoin were among those infected.

Security experts claim the hackers exploited a flaw in the Microsoft Windows operating system to carry out the attack, a flaw that had been made public by the hacker collective Shadow Brokers in April this year. The attackers reportedly extorted ransom payments from victims around the world before the outbreak was effectively shut down.

It was a global crisis not without precedent.

That attack, a ransomware worm called WannaCry, struck in May and was only the most recent entry on a long list. In July 2009, a series of coordinated botnet attacks was launched against government and financial websites and news agencies in the United States and South Korea, impacting somewhere between 50,000 and 166,000 computers. The following year Stuxnet, the product of a joint United States-Israel cyber weapon project, wreaked physical havoc on computer-controlled infrastructure in Iran, reportedly destroying roughly a fifth of the country’s nuclear centrifuges. Then there was the 2007 Estonian cyberwar, the 2011 Citigroup hack and the 2008 hacking of the United States presidential campaigns, to name just a few of the damaging attacks launched in cyberspace since the advent of cyberwarfare. And the number of cyberattacks carried out each year appears to be growing at a troublesome rate.

In the first six months of 2017 alone, there were 918 data breaches compromising 1.9 billion data records, according to the latest findings by digital security provider Gemalto. The most shocking part? That represents a 164 percent increase in the number of records compromised compared to 2016.

So, are cybersecurity systems failing to keep abreast of the new, malicious technologies being developed? Are we as users not taking sufficient online security precautions? Or are the hackers just getting better?

Perhaps it is the latter. After all, the approach taken by cyber attackers has certainly changed. While 15 years ago a suspicious email would typically come from an overseas ‘Nigerian prince’ requesting vast sums of money, hacking tactics post-2008 have become far more subtle. Google security engineer Mike Hearn confirms that hackers now mostly use Google mail accounts to send spam and have begun accessing private accounts so that spam requests come from trusted Google contacts, fooling even the most sceptical among us.

“Although spam filters have become very powerful – in Gmail, less than 1 percent of spam e-mails make it into an in-box – these unwanted messages are much more likely to make it through if they come from someone you’ve been in contact with before. As a result, in 2010 spammers started changing their tactics – and we saw a large increase in fraudulent mail sent from Google Accounts,” he wrote in a blog post.

Grid cybersecurity expert Robert M. Lee, CEO of industrial cybersecurity firm Dragos, Inc. claims that attackers are becoming more aggressive.

“They’re learning a lot about our industrial systems, not just from a computer technology standpoint but from an industrial engineering standpoint, thinking about how to disrupt or maybe even destroy equipment. That’s where you start reaching some particularly alarming scenarios,” he said.

Lee also said that because businesses are beginning to use more common operating platforms, adversaries have even greater opportunities to wreak havoc on scales never before seen in the cyberwarfare space.

Cyberattacks are now the number one external risk factor facing businesses, according to 23.1 percent of 39 Chief Financial Officers surveyed by CNBC earlier this year. And the scale and type of attacks businesses risk being hit by will only escalate in 2018.

For a start, hackers will move beyond pure data theft and begin to compromise and alter stolen data, causing longer-term reputational damage to individuals, groups and corporations. Software updates will reportedly become a key tool for hackers, exploited to hit every customer in a software program’s supply chain. Smartphones will be compromised, with spies capable of accessing personal information on your mobile phone from remote locations, and fake tax scams mimicking the Equifax breach (which affected an incredible 145.5 million people) will once again cause widespread damage, with viruses capable of infecting the software businesses use to file tax returns.

So what can be done?

As an individual, you can be extremely vigilant in all internet dealings and follow your gut instinct when something seems wrong, but you can also routinely change your passwords, sign up for real-time alerts from your bank, sign up for identity protection services and install antivirus protection. It’s also worth allocating an hour a week to checking your credit card statements for signs of compromise. As a company, it should be a prerequisite that you have an effective cybersecurity plan, educated staff, firewalls and antivirus software installed, and a secured Wi-Fi network. If you notice strange activity on your company website, suspect malicious software or feel your security has been compromised, it is worth paying for emergency repair services such as spyware and malware removal. It is also worth investing in a cyber liability insurance policy.

Insuring and protecting yourself against the rising tide of cyberwarfare makes sense if you want to avoid being compromised by the 230,000-plus new malware samples released every day. It’s not that hard.

What is wrong with Healthcare?

Unless they are suffering from disease, most people are unaware of how fragmented and ineffective healthcare currently is. It might be shocking to learn that millions of Americans suffer from inadequate healthcare, stemming from the lack of a proper system capable of accessing a comprehensive record of a patient’s medical history.

It might seem like a no-brainer that a person’s medical history should be instantly accessible to medical staff, insurers and, most importantly, the person who owns it. And yet it remains a dream yet to be realized.

Back in 2009, an effort to digitize patient records under the Health Information Technology for Economic and Clinical Health (HITECH) Act failed to realize the dream of a unified database accessible to relevant parties. Eight years down the line, and with an allocation of $30 billion, HITECH still has little to show for it. The biggest problem was that no technology capable of supporting a seamless, widely adopted database existed. Until now.

The birth of bitcoin made way for more than just an encrypted and effective form of virtual currency; its underlying innovation has pointed healthcare toward an answer. While patient information used to be stored on private servers, a blockchain (the technology cryptocurrency employs) opens the doors for data, with everyone possessing the potential to create a mini-server. As explained on Laissez Faire, traditional medical records are kept on “servers run in their own isolated code silos, making it really hard to share data. However, with blockchain, anyone can create a miniature server on their own device that replicates all the necessary parts of other miniature servers so there is no one place to hack, break, or lose information.”

Blockchains were originally used by bitcoin to “secure transactions through decentralization”, ensuring that transactions remain secure and anonymous. They also prevent what is known as “double spending”, whereby the same amount is spent twice through an induced glitch. The system records parts of each transaction across many computers rather than all in one place, much like giving everyone a piece of the puzzle: the pieces must all fit together consistently, so the record cannot be “faked or changed”.
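The “cannot be faked or changed” property comes from each block embedding a hash of its predecessor, so altering any record breaks every link after it. Here is a minimal sketch of that mechanism; it is a toy chain using SHA-256, nothing like production bitcoin code, and the patient records are hypothetical.

```python
import hashlib
import json

# Toy blockchain: each block stores a hash of the previous block, so
# tampering with any record invalidates every later link in the chain.

def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})

def is_valid(chain):
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
add_block(chain, "patient A: blood test")
add_block(chain, "patient A: x-ray")
print(is_valid(chain))        # True
chain[0]["data"] = "forged"   # tamper with an early record...
print(is_valid(chain))        # False: the broken link exposes it
```

On a real network many nodes hold copies of the chain, so a forger would have to rewrite every subsequent block on a majority of them at once, which is what makes tampering impractical.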

Medical data stored on blockchains would therefore enjoy the same level of security, without the potential for being stolen along with transaction details and the like. Perhaps in the future the technology will even be used for recording historical events or political affairs, to keep them honest and free from tampering.

The one public concern regarding blockchains is the amount of energy they require. According to the Huffington Post, mining one bitcoin currently costs the energy equivalent of twenty barrels of oil. That is unsustainable for a single cryptocurrency, let alone with political, historical and medical uses added to the mix. However, new innovations are looking at ‘greening up’ the technology, such as EcoChain, powered by Power Ledger, which utilizes surplus solar energy. But who is incorporating blockchain into healthcare?

Patientory is one company looking to make use of blockchain technology in healthcare. It is putting “Electronic Medical Records (EMR) onto an Ethereum-based blockchain”, and the company’s CEO released a statement describing how fragmented the healthcare system is and pledging that the company would bring the industry together “as a collective toward reducing costs and improving not only the U.S. healthcare infrastructure but the global healthcare ecosystem.”

DeepMind is another company looking to build on the system, launching a blockchain-style ledger for digital medical records. With so much interest in the technology behind cryptocurrencies, and the resources being poured into developing more practical solutions for data control, this does not seem to be a passing trend.

Even Mark Cuban, a bitcoin skeptic, has thrown himself into what he once called a “bitcoin bubble”, investing in a $20 million fund called 1confirmation in the belief that cryptocurrencies will be adopted by healthcare and finance. 1confirmation, a crypto venture fund founded by Nick Tomaino, aims to fund decentralized networks, allowing burgeoning new cryptocurrencies to gain a foothold in a virtual world where, with the rise of cryptocurrency prices, they are popping up like mushrooms.

A pharmaceutical company planning to expand into anti-aging products has pitched an idea for the public to earn cryptocurrency by selling their medical data, such as blood tests and selfies. The company, Insilico Medicine, Inc., claims that the average seller would be able to earn a substantial income through weekly submissions.

Another idea for combining healthcare and blockchain technology comes in the form of MedRec. It is used purely for storing and sharing medical information, and in return for fuelling its blockchain, miners receive “access to aggregated and anonymized data from consenting patients”, an arrangement that has reportedly worked well.

From being used to purchase illicit material anonymously to paving the way toward a promising future in data storage, cryptocurrency truly has come a long way, bringing with it great change.