Huawei’s rival Apple and social network giant Facebook also have AR ambitions.
Somehow, an Apple event, even one where we knew most of what was due to be announced beforehand, still generates excitement and anticipation like no other. Even I can’t resist casting a casual eye to see which rumors come true. Mostly because typically someone in Berlin arranges an event where we all drink a lot and laugh at the over-the-top production of American events. But I digress. What did Apple announce, and what does it mean for developers and other tech-minded people?
Starting with what Apple didn’t announce. Despite many rumors and hopes, Apple did not announce a new MacBook Air, or further cement the Mac in a concrete coffin with new iPads. To be honest, I don’t remember many Apple events where multiple hardware lines were announced at the same event, so I’m not sure why people keep expecting it. These devices may come soon, or they may not; only Apple knows.
When Google and Apple announced their mobile augmented reality (AR) platforms (ARCore and ARKit, respectively) last summer, it sparked interest in 3D from a potentially huge group of designers and developers who have traditional 2D mobile application experience but are new to AR.
Accustomed to mature tools that formed (mostly) integrated workflows, these newcomers found a 3D workflow cobbled together using legacy tools and design patterns borrowed from gaming, video and film entertainment, and the architecture and engineering disciplines. This hodgepodge made the learning curve steep, development expensive, and cross-discipline collaboration nearly impossible.
To complicate matters, there were no resources – like how-tos, reference applications, and design guidelines – to help. Designers were forced to borrow liberally and make things up as they went along, resulting in an often awkward marriage between mobile gaming or architecture and traditional 2D UX patterns.
This improvisation was no surprise, given the newness of the platforms. It would take time for patterns to emerge, then be evaluated, classified and organized into something systematic.
The recent announcement of Google’s Augmented Reality Design Guidelines (GARDG) caught our attention. How far had design guidelines progressed? Did they reflect the priorities or address the needs of designers, both as expressed to us during interviews and as we have experienced while building a complex 3D application? Most importantly, would we recommend these guidelines to designers looking to get started in AR?
The short answer to the last question is “yes.” But with a caveat: we’d recommend them only as a starting point, and only for a few critical concepts.
Overall, the GARDG excels when it encourages designers to build applications that focus on motion and environmental engagement. It excels in drawing attention to the critical role movement, specifically user movement, plays in AR. Mobile app developers must account for new device orientations driven by camera position. Users no longer simply cradle a device; they hold it more deliberately. Therefore, designers must account for user fatigue and for whether UI elements could cause users to cover the camera, especially in landscape mode. Different device types and weights also affect the user’s experience: a tablet offers a larger screen but weighs more than a phone, potentially increasing fatigue.
GARDG reminds designers of one of the most overlooked aspects of AR in our experience: end-user mobility and how it shapes interactions with immersive designs. Consider whether the user is on a plane, uses a wheelchair, or is unable to move or hold a device. In order for everyone to have access, developers should design for four user modes — seated, hands fixed; seated, hands moving; standing, hands fixed; and standing, hands moving (which would be full range of motion). Intertwined with its guidance on mobility, the GARDG stresses the outside environment and ensures designs never sacrifice user safety. Making users back up blindly, or encouraging them to move forward while the device is pointed in a different direction, is strongly discouraged.
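The four modes form a simple two-axis matrix, which can be modeled directly in code when gating interactions. A minimal sketch (the names and the helper method are our own, not from the GARDG):

```python
from dataclasses import dataclass
from enum import Enum

class Posture(Enum):
    SEATED = "seated"
    STANDING = "standing"

class HandMobility(Enum):
    FIXED = "fixed"    # hands hold the device in one position
    MOVING = "moving"  # user can gesture and reposition the device

@dataclass(frozen=True)
class UserMode:
    posture: Posture
    hands: HandMobility

    def supports_full_range_of_motion(self) -> bool:
        # Only a standing user with free hands has full range of motion.
        return (self.posture is Posture.STANDING
                and self.hands is HandMobility.MOVING)

# The GARDG's four accessibility modes: the full two-axis matrix.
ALL_MODES = [UserMode(p, h) for p in Posture for h in HandMobility]
```

An app can then pick the least demanding interaction that works in the user's current mode, rather than assuming everyone can walk around a room.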
The GARDG struggles, however, to keep pace with the demands of designers and developers. And it reveals how both Google and Apple may well trail behind users in understanding the potential of their platforms for leveraging 3D to accomplish complex tasks or communicate complex experiences.
The current design guidelines make no allowance for complex mechanics – or really anything beyond simple object placement and sticker-like functionality. Designers are attempting to build apps that include interactivity: object selection, conditional behaviors, branching scene flows or storyboards driven by user behavior, movement between scenes using teleportation, portals, and lots of physical gestures.
The GARDG is missing multi-scene use cases, which automatically excludes many modes of interactivity or conditional behavior that leads to transitions, complex or more interesting changes of state, personalization, and ultimately a deeper, more immersive experience.
Similarly, there is no discussion of animations (a common topic in our interviews with designers), either triggered or timed, or the notion of a shared or collaborative environment. One of the most sought-after features is the ability of one user to either collaborate in the same scene from the same location on different devices or, in some cases, to share the same camera view. The latter example is one we frequently encounter when designers wish to allow remote collaborators and clients to see their AR prototypes in the environment for which they were intended and to provide feedback, all in real time.
Even in the relatively tame realm of static object creation and placement, designers are already searching for the best way to design for complex behaviors, such as selecting objects that might be hidden by other objects.
The guidelines make assumptions about optimal object placement range – within the reach of the user – that we see no reason to codify at this time. What about throwing objects as a method of placement, or pointing at, grabbing, and interacting with objects at a distance?
Overall, the GARDG is a good starting point. Most importantly, it situates the user and their needs, in particular how they interact with the physical environment while using augmented reality, squarely at the center of the design guidelines. The overall tone of the guidelines focuses mainly on being mindful of humans out in the world using your (simple) application, and that is okay.
But for designers in a hurry, this conservative approach by Google (and Apple) presents both an obstacle and an opportunity. Current workflows available for these platforms increase friction, cost time and money, and ultimately limit what projects can attempt. Designers are already developing clever workarounds, and companies like CameraIQ, Wikitude, and my own company, Torch 3D, are starting to develop tools.
This brings us to the opportunity. The patterns and best practices of mobile AR UX design are wide open for anyone to help define. What works well will win. Innovations by the pioneering designers of today will inform the design guidelines of tomorrow.
The mobile AR space may seem chaotic and disorganized right now. But in a matter of years, I believe we will look back fondly on these days as ones of incredible experimentation, freedom, and learning.
Paul Reynolds has been a software developer and technology consultant since 1997. In 2013, after 10 years of creating video games, Paul joined Magic Leap, where he was a Senior Director overseeing content and SDK teams. In 2016 Paul moved to Portland, OR, where a year later he founded Torch 3D, a prototyping, content creation, and collaboration platform for the AR and VR cloud.
On April 19, Viro Media will launch image recognition for Android/ARCore on the Viro developer platform. Recently, Apple (ARKit 1.5) and Facebook announced support for image recognition on their respective platforms. Viro now enables this key AR feature for Android developers. With image recognition, developers can find and recognize the position of 2D images.
Image recognition will enable AR experiences such as interactive museum exhibits, movie posters brought to life with 3D characters, and navigation assistance through markers and signs.
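Viro does not document its detection internals, but the underlying idea – locating a known 2D image inside a larger camera frame – can be illustrated with a toy brute-force template match. Real trackers use feature descriptors and run far faster; this is only a sketch of the concept:

```python
def match_template(image, template):
    """Return (row, col) where `template` best matches inside `image`,
    using sum of squared differences over grayscale values.
    Toy illustration only: real AR image trackers (Viro's included)
    use feature-based matching, not brute-force SSD."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best_score, best_pos = float("inf"), (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            ssd = sum(
                (image[r + dr][c + dc] - template[dr][dc]) ** 2
                for dr in range(th) for dc in range(tw)
            )
            if ssd < best_score:
                best_score, best_pos = ssd, (r, c)
    return best_pos
```

Once the 2D position (and, in a real tracker, the pose) of the image is known, the app can anchor 3D content to it.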
Viro’s image recognition feature is available across both their developer frameworks, ViroCore and ViroReact. The Viro Developer Platform is the perfect alternative to specialized game engines for building AR apps: it allows companies to focus on what they do best, in the languages they know best, instead of training or hiring specialized 3D developers.
“Viro’s mission is to enable AR/VR everywhere by giving developers the tools they need to build amazing immersive experiences. AR gets better, the more we understand the real world. Image recognition is a big step toward unlocking new use cases for AR and we are excited to bring this to Android and across the Viro platform.” – Danny Moon, CEO, Viro Media
Since launching AR support in October 2017, Viro has seen a spike in developers using their platform. When ARCore launched, 10% of the non-gaming AR apps were powered by Viro, including:
Food Network: “Our team was facing a tight deadline on an AR project, and needed a solution to give us a head start on 3D rendering. Viro came along at just the right time with a solid, intuitive API, extensive documentation, and responsive, knowledgeable support. It was a pleasure to use and allowed us to get to product launch with time to spare. We’re definitely looking at using Viro in our future AR projects.” –Josh Williams, Android Lead, Food Network In the Kitchen
Streem: “The Viro platform and team were essential in getting Streem’s remote AR experience into the hands of our Android users as soon as we did. The product exactly fit our needs, and the team was incredibly responsive and helpful during our development stage. We would not have cross-platform AR right now without Viro” –Sean Adkinson, CTO, Streem
Filmr: “It is a unique analogue of SceneKit in Android which already has various tools and supports ARCore. Also, Viro support is really responsive and fast! We wouldn’t have achieved such progress so fast without Viro!” –Alex S., Android Developer, Filmr
The Viro platform is free, has no limits on distribution, and is easy to get started with.
At DashBouquet, we are keen to learn new things and constantly improve our skills and knowledge. That’s why we always keep an eye on the latest trends in order to stay ahead of the competition and satisfy not only the requirements of our clients but also the needs of their users. So we thought it would be a good idea to share the most anticipated UX design trends for 2018 and explain why they will matter so much.
Today the world revolves around the concept of immediacy. Snapchat and Instagram stories, real-time streaming, and much more — people are now used to the fact that they can always see what others are doing right now and they expect such options to be in almost every app they use.
Swiftly moving away from the world of web apps and desktop accessibility, 2017 saw an upsurge in the number of users choosing mobile to be their primary option for internet access. Unlike what skeptics predicted, mobile app development was no bubble, nor was it an impermanent trend destined to run out, like an iPhone battery running on iOS 11. Mobile apps are a culture we have all grown accustomed to, from their presence in location-based apps, to their development on the spectrum of augmented and virtual reality.
In 2018, with respect to mobile app development, we will either be seeing brand new trends, or a huge upgrade from what users are already using. Before we dig into these trends, it is important to run through the anticipated hardware upgrades that will fuel the future of mobile app development.
The primary design of the smartphone is all set for an overhaul. Given the screen design introduced by Apple in their iPhone X, the likes of Samsung and its Android contemporaries in China will be looking to introduce devices without buttons. At the risk of sounding overly optimistic, one can hope for bendable designs to be made cost-effectively.
Of the 254 billion apps estimated to have been downloaded in 2017, over 90% would have been hostage to network speeds, especially in the markets of Africa and Asia, where infrastructure remains a pressing constraint. These markets need better mobile app performance. With 5G network trials set to be underway across the world, the realm of app development will witness a change. The integrated chips powering our smartphones are being improved as I write this, and in 2018 they are set to go a notch higher, allowing mobile app developers greater room to play when it comes to app intricacies. Fans of augmented and virtual reality can already feel the adrenaline rush.
These enhancements in hardware are going to pave the way for mobile app development trends in 2018. For startups that have been hit by a plague of failures since 2016, this year could be a time of redemption, if it manages to ride high on the following ten trends in mobile app development.
AMP listings were integrated into Google search in 2016, and since then developers have not looked back. By incorporating them within the app framework, developers have been able to use this stripped-down version of HTML for better user experience and retention, with Facebook Instant Articles being one of the many success stories.
Thanks to Apple, the affordability constraint is out of the equation. App developers, starting in 2018, will be looking to develop apps for wearable devices, mostly watches. Currently, the likes of Zomato and Uber have invested in wearable app development, but, like most, they have only scratched the surface.
‘Pokémon Go’ may have been a temporary storm on the eastern seaboard, but AR and VR are here to stay. With the market predicted to reap over $200 billion in revenues by 2020, developers are expected to create breathtaking mobile app experiences in AR and VR, and with compatible hardware entering the market, we can’t wait to get this party started.
It took years, but the world is finally waking up to the possibilities offered by cloud computing and integration. Streamlined operations, reduced hosting costs, better storage and loading capacity, and increased user retention are a few of the many advantages of developing mobile apps on the cloud.
Yes, yes, we know it is nothing new, but with Uber finally coming clean about its hack, app developers will be looking to invest more in cybersecurity, given it is directly linked to users’ data privacy and protection laws. The finest minds in the industry will have to up the ante to dispel the uncertainty around mobile apps.
Mobile apps are going to move on from being mere utilities to being an integral part of your workflow. Giants like Facebook, Google, and Apple are already employing AI to use predictive analytics to enhance the customer journey across the UI/UX of the app, and 2018 is set to witness advances in this field.
What was once termed an inevitable bubble in the realm of mobile app development is now the future. With industries embracing the on-demand business model like an old friend, one can expect UI/UX enhancements, m-commerce facilities, predictive analysis, and business bots to fuel the growth of Uber-like apps in 2018.
At the risk of sounding like a cliché, one cannot emphasize enough that smartphones are the future, given how trends in mobile app development have captivated users across the globe for the last three years. As we usher in another year, with inquisitiveness and excitement galore, the future of mobile app development resides with some of the finest brains in the business.
Black Friday 2017 was the biggest mobile shopping day in the United States. It was the first day ever to see more than $2 billion in mobile shopping revenue.
Demandware’s Rick Kenney says that for the first time, mobile devices (46 percent) beat out computers (45 percent) for the most orders that customers placed on Thanksgiving Day. That’s up from 31 percent in 2016!
This massive shift toward mobile is hardly isolated to Black Friday. Consumers are increasingly using their smartphones as their primary shopping device.
Even beyond direct spend, mobile devices have an enormous influence across the entire shopping experience. When they’re at a physical store, for example, shoppers might use their smartphone right up until the moment of purchase. Customers check their phones to compare prices online, read reviews, see if the new tech gadget they want is compatible with their computer, and more.
Although the trend toward mobile shopping may be obvious, what’s less obvious is identifying the ideal mobile experience for your customers. However, one thing is certainly clear: With so much time spent on mobile, consumers don’t just want to use an app, they want to have a great, seamless experience.
The only way to compete with online retail juggernauts like Amazon is to innovate through experimentation. Amazon CEO Jeff Bezos said that “our success at Amazon is a function of how many experiments we do per year, per month, per week, per day.” This long-term experimentation mindset is likely one of the reasons why the company was able to draw in an incredible 55 percent of all online transactions on Black Friday this year.
You don’t need to be Amazon to adopt their experiment-led mindset. One of the benefits of testing? The ability to safely think outside the box.
For example, furniture retailer IKEA is experimenting with augmented reality (AR) to allow customers to “look” before they buy. Users of the Ikea Place mobile app can search for furniture and then overlay a digital rendering of the product on a photo of their home. As the first point of contact for many shoppers, mobile devices will play a key enabling role for AR, thanks to continual improvements in their computing power and graphics.
Another mobile-enabled shopping innovation is driven by retailers such as Walmart, which recently introduced a “Scan & Go” pilot program in select stores. Using the Walmart Scan & Go app, in-store customers can scan the barcodes of products they want to purchase, pay for them with a single tap, and then show the receipt to an employee without having to stand in long checkout lines.
While the biggest retailers like Walmart are experimenting with the nature of shopping itself, you don’t have to disrupt the industry to see the benefits of experimenting with the mobile shopping experience. By bringing an experiment-driven mindset to every facet of your business, you can make incremental improvements that add up to big benefits.
From the UX you choose to surface, to the marketing copy that drives conversion, to the steps a user takes to become a customer, every business decision can be based on data-driven insights. Mobile technologies that enable experimentation have the potential to unlock these insights and grow your business.
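A data-driven decision of this kind usually comes down to a simple statistical comparison between variants. As a hedged illustration (not any particular vendor's methodology), here is a minimal two-proportion z-test for an A/B experiment, using only the standard library:

```python
from math import sqrt, erf

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B experiment (illustrative sketch).

    conv_a/conv_b: conversion counts; n_a/n_b: visitors per variant.
    Returns (z, two_sided_p_value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

For example, 100/1000 conversions on the control versus 150/1000 on the variant yields a z around 3.4, comfortably significant at the usual 5% level.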
The advent of smartphones has revolutionized the commerce sector. Its impact has been indelible in almost every field of business. It has even birthed an entirely new market termed “e-commerce.”
Since its inception, e-commerce has made rapid inroads into the retail market. The situation has come to a point where some market analysts are of the opinion that e-commerce is the death knell for retail, while others still believe retail will hold its ground. Certain stats do support the latter claim; for instance, a report by Fung Global Retail and Technology states that Amazon.com’s share of the US apparel market was a mere 3.7%, including third-party sales, for the year 2016. The same study also states that no e-commerce stores made it into the top 10 preferred outlets. At the same time, the fact that e-commerce has disrupted retail cannot be denied.
[Image: The CnC approach (CBC.ca)]
Let’s take, for instance, the grocery industry. E-commerce is reshaping the grocery market in innovative ways. Advancing technology has opened up ways in which your grocery shopping experience will be as good, if not better, than shopping physically in the store, all within the confines of your smartphone and without taking a step outside your house. Two intriguing features provided by e-commerce grocery stores are Home Delivery (HD) and Click and Collect (CnC).
HD is purely an e-commerce product that involves the consumer visiting the grocery seller’s app or website, adding items to a virtual cart, and finally paying and checking out. The purchased groceries are delivered to the consumer by the seller. CnC, in turn, involves the customer selecting and adding items in the grocery app and then driving to the nearest physical store of that online grocery store and collecting their products. Thus, CnC is a hybrid between a traditional store and an online store.
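The HD/CnC distinction is easy to capture in a data model; here is a minimal sketch with hypothetical names (no real grocery platform's API is implied):

```python
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional

class Fulfillment(Enum):
    HOME_DELIVERY = "HD"       # seller ships the basket to the customer's door
    CLICK_AND_COLLECT = "CnC"  # customer collects the basket at a physical store

@dataclass
class GroceryOrder:
    items: List[str]
    mode: Fulfillment
    pickup_store: Optional[str] = None  # required for CnC, unused for HD

    def validate(self) -> bool:
        if not self.items:
            raise ValueError("empty basket")
        if self.mode is Fulfillment.CLICK_AND_COLLECT and not self.pickup_store:
            raise ValueError("CnC orders need a pickup store")
        return True
```

The one structural difference between the two flows falls out naturally: a CnC order is invalid until a pickup store is chosen, while an HD order needs only a delivery address (omitted here for brevity).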
[Image: Ferrari’s AR app (Augmented Reality Trends)]
Augmented Reality (AR), initially considered a gimmick, is now seen as a feature with tremendous potential for application in the e-commerce sector. The recent patent application filed by Amazon for AR-related features has put it in the spotlight. Basically, AR helps bring the store to your home, quite literally. Features like virtual changing rooms and customization help consumers make more informed purchase decisions.
Despite all this, just how much of an impact have AR and e-commerce actually had on retail?
Some studies paint a picture in favor of retail. For instance, a recent Food and Beverage Report released by Forbes states that e-commerce’s share of the grocery industry barely registers – it struggles to top the 2% mark. No more than 5% of adults make at least six online grocery purchases a year. By contrast, 78% of adults make purchases regularly at traditional grocery stores, with Walmart alone capturing 56% of that shopping. Interestingly, the survey also showed online shopping growing by 14%, compared to a mere 2% increase for physical stores. But hold on, do those growth figures really tell the whole story? Scrubbing through the report’s data, online grocery’s growth from 1.4% to 1.6% of market share works out to an increase of $1.6 billion. The 2% increase shown by physical stores totals roughly $16 billion – 10 times the growth shown by online grocery stores.
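The back-of-envelope arithmetic behind those dollar figures is worth checking: a 0.2-percentage-point share gain worth $1.6 billion implies a total market of roughly $800 billion, and 2% growth on that base is about $16 billion. A quick sanity check (treating the physical-store base as approximately the whole market, which is why the result is approximate):

```python
# Back-of-envelope check of the Food and Beverage Report figures above.
online_share_gain = 0.016 - 0.014        # 0.2 percentage points of market share
online_dollar_gain = 1.6e9               # reported $1.6 billion increase

# If 0.2 points of share == $1.6B, the whole market is about $800B.
implied_market = online_dollar_gain / online_share_gain

# 2% growth for physical stores on (roughly) that base is about $16B.
physical_growth = 0.02
physical_dollar_gain = physical_growth * implied_market
```

So the seemingly lopsided growth rates (14% online versus 2% physical) still translate into ten times more absolute dollar growth for physical stores, exactly as the report claims.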
There are also reports that tell a starkly different tale. According to a study conducted by retail advisory firm HRC Advisory, operating earnings fell by 25% due to a shift to online sales from physical stores. Additionally, omnichannel and e-commerce investments, along with expensive fulfillment of e-commerce transactions, contributed to the decline.
Variable logistics costs, IT and supply chain upgrades, and the expense of maintaining a high level of returns on online operations are together driving up SG&A costs (by approximately 2-3 percentage points of net sales). This, combined with wage inflation, falling physical store sales, and rising real estate costs, has resulted in a decrease of 1-2 percentage points in the profit contributed by physical stores, the study concluded.
Whatever its impact, e-commerce is here to stay. It will be interesting to see how this competition unfolds, and whether e-commerce actually spells doom for retail. Although the statistics are divisive and inconclusive, this much is certain: the arrival of e-commerce, supplemented with features like AR, has disrupted retail’s unchecked reign over the market, and if retail doesn’t adapt to the changing needs of these changing times, the divisive studies might soon count it out in unison.
Been a while since we got around to making a post here – these days, if you want to keep up with what we’re doing you have a much better chance of doing it over at our travel blog everyworldheritage.site … or following @everywhs on Twitter … or subscribing to us on Youtube … or Instagram … or Facebook … or even Pinterest. Yep, we’re giving a full-on shot at this “travel influencer” thing.
We have started to get the hang of what we’re doing over there though, so we will be posting here a bit more frequently. For starters, let’s catch up with where we left off telling you we’d been invited to Moscow for MBLTdev! We are fairly blasé about the regular run of shows … but one in Moscow? Now that’s fun! And it went pretty well, considering – we wrote about it a bit over on the new blog:
Yep, there we are with our audience participation ARKit demos in progress, wearing our new signature Indonesian batik dragon presentation shirt. Want another look?
More photos in this Facebook album, and if you’d like to check out the slides, they’re the first download here on Google Drive; the video itself appears to be restricted to participants, or at least people with an MBLTdev account, but if you have one, check it out.
If you’d like to take a look at our demos, they’re up on GitHub:
ARSelfie: Demonstrates adding an SKLabelNode to a tapped ARAnchor in an SKScene with the current top result from a classification MLModel.
ARWorld: Demonstrates adding a collection of SCNNodes to an ARSCNView representing the results of an MKLocalSearch for nearby POI.
ARCloud: This was the … overenthusiastic … stretch goal. We’d planned some Really Cool persistent-presence demos using the Estimote gear, since iBeacons are one of those things we’ve never quite had the opportunity to dig into as much as we wanted, so this was a good excuse! Well, that didn’t quite work out; we went to our Plan B of pretty much just shoving some ARKit glitter on top of a stock proximity-detection sample. But it amused the audience by all appearances, so hey, we’ll call that a win.
So that was a great change in every way from the glorified product demos that have tended to mark our speaking career so far, and it was definitely a good conference all around; if you have a hankering to visit Moscow, we definitely recommend planning your trip around an MBLTdev conference!
Our mission at Viro is enabling AR/VR everywhere by building tools that simplify development. Enabling more developers to build AR/VR experiences will lead to a better, larger, and more diverse ecosystem of apps. We started with ViroReact, our AR/VR platform for web and mobile developers leveraging React Native. With the launch of ARKit, we saw how Apple democratized AR development on iOS with SceneKit. We wanted to offer that same experience, native performance with descriptive APIs, to Android developers with ViroCore. (Read what XDA has to say about ViroCore.)
The Viro platform is free with no limits on distribution. Sign up for an API key and start building AR/VR apps today using ViroCore or ViroReact.
ViroCore is SceneKit for Android developers using Java. ViroCore combines a high-performance rendering engine with a descriptive API for creating immersive AR/VR apps. While lower-level APIs like OpenGL require you to learn and precisely implement rendering algorithms, ViroCore requires only high-level scene descriptions and the events and interactions you desire. Easily add animations, physics, particle effects, and more to your Android applications.
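To make the contrast with low-level rendering APIs concrete, here is what the descriptive scene-graph style looks like in miniature. This is an illustrative Python sketch of the pattern only; ViroCore’s actual API is Java and differs in names and detail:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Node:
    """A scene-graph node: you describe what you want in the scene,
    and the engine handles the rendering algorithms for you.
    (Hypothetical names, for illustration of the style only.)"""
    name: str
    position: Tuple[float, float, float] = (0.0, 0.0, 0.0)
    children: List["Node"] = field(default_factory=list)

    def add(self, child: "Node") -> "Node":
        self.children.append(child)
        return self

    def count(self) -> int:
        """Total nodes in this subtree, including self."""
        return 1 + sum(c.count() for c in self.children)

# Describe a scene declaratively: a lamp one meter up, two meters ahead,
# with an attached glow effect. No draw calls, shaders, or buffers.
scene = Node("root").add(Node("lamp", (0.0, 1.0, -2.0)).add(Node("glow")))
```

With OpenGL, producing the same lamp means writing vertex buffers, shaders, and a render loop by hand; with a descriptive API, the high-level scene description above is the whole program surface the developer touches.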
ViroCore is the perfect alternative to specialized game engines for building AR/VR apps. It allows companies to focus on what they do best, in the languages they know best, instead of training or hiring specialized 3D developers. ViroCore supports ARCore, Google Cardboard, Daydream and Gear VR.
With ViroCore, developers have access to a feature-rich platform necessary to build robust AR apps:
You can build your first AR/VR app in minutes. Just sign up for a free API key and follow our easy Getting Started instructions. For more detail check our extensive development Guides and Javadoc API reference. We are excited to see what the Android community builds with ViroCore.
ViroReact now supports ARCore, in addition to ARKit, making it fully cross-platform compatible for mobile AR development. Developers can use one code base for their AR/VR apps across iOS and Android. Current ViroReact developers, your ARKit apps should work out of the box on ARCore!
ViroReact brings the best features of React Native to AR/VR development: declarative APIs, flexible layouts, responsive components, and cross-platform support. Viro enables fast and iterative development by offering testbed apps for iOS and Android, eliminating the need for Xcode or Android Studio while developing. Build immersive standalone AR apps or add features like Snapchat’s AR effects into existing apps with ViroReact.
Getting started with ViroReact is easy. Sign up for a free API key, then follow our Quick Start Guide to be set up in minutes. Check out our tutorials and code samples to start building your own app today.
As virtual reality (VR) technology attracts both consumers and investors, a wide range of startups are taking AR/VR beyond headsets to give users a more authentic experience.
Scheduled to be held from November 30 until December 2 in Seoul, Korea, Startup Festival 2017 is one of the leading events in Asia’s startup scene. Hosted by the Ministry of SMEs and Startups, the festival’s agenda focuses on the technologies driving the Fourth Industrial Revolution, such as IoT, fintech, ICO, and AR/VR.
Here are three AR/VR startups you need to watch out for at the festival:
Does VR make you sick, literally? Many people experience simulation sickness when playing VR games. If you’re seated but your visual cues signal you’re walking, then chances are you will experience dizziness, nausea, and other symptoms related to motion sickness.
Your body feels disconnected when what you see doesn’t match what you feel. The evolutionary explanation for this is that your body assumes that you’ve been poisoned, so it tries to induce vomiting to cleanse your system.
Wizdish’s ROVR is a VR treadmill that allows a person to walk and move freely in VR worlds. The treadmill listens to the sound made by sliding feet and converts this into forward motion in games. This feature allows you to fully engage with your game and matches the visual input with your physical stimuli. Less sickness, more motion.
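Wizdish hasn’t published its algorithm, but the core idea – turning the loudness of sliding feet into forward velocity – can be sketched as a mapping from the audio signal's RMS level to speed. All thresholds and gains below are made up for illustration:

```python
from math import sqrt

def rms(samples):
    """Root-mean-square level of an audio buffer (values in [-1, 1])."""
    return sqrt(sum(s * s for s in samples) / len(samples))

def speed_from_audio(samples, noise_floor=0.05, gain=4.0, max_speed=2.0):
    """Map foot-slide loudness to forward speed (m/s) in a VR world.
    Illustrative sketch only -- not Wizdish's actual algorithm.

    Below the noise floor the user is treated as standing still;
    above it, speed grows with loudness up to a walking-speed cap.
    """
    level = rms(samples)
    if level <= noise_floor:
        return 0.0
    return min(max_speed, gain * (level - noise_floor))
```

Because the visual motion is now driven by a real physical action (sliding feet), the visual and vestibular cues stay roughly in agreement, which is the mechanism behind "less sickness, more motion."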
Imagine you’re walking through a cave. Your eyes would probably dart to the dark shadows on the wall and the faint light from your torch. Your ears would also likely perk up because sound is an important aspect of “being present” in any given environment. If a bat were to fly over your head, the flapping sound from its wings wouldn’t be the same as what you’d hear if it were to fly beside you.
Traditional audio recordings, however, can only account for the sound from one fixed point – where the microphone was placed.
With 3D audio, the virtual sound in headphones is designed to come as close as it can to sounds in the real world. 3D audio startup Kinicho’s novel approach to producing 3D spatial audio in VR/AR helps developers take better control of their soundtracks.
To deliver a more authentic VR experience, Kinicho’s method takes into account the spatial relationship involving listeners, sound emitters, and the environment in a virtual world.
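Kinicho’s method is proprietary, but two basic ingredients of spatial audio – distance attenuation and left/right panning derived from the source’s position relative to the listener – can be sketched in a 2D toy model. A real pipeline (HRTFs, room acoustics, head tracking) is far richer than this:

```python
from math import sqrt, atan2, sin

def spatialize(listener, source, ref_dist=1.0):
    """Toy stereo gains for a sound source, listener assumed facing +y.
    Inverse-distance attenuation plus equal-power left/right panning;
    illustrative only, far simpler than a real spatial-audio engine."""
    dx, dy = source[0] - listener[0], source[1] - listener[1]
    dist = max(ref_dist, sqrt(dx * dx + dy * dy))
    atten = ref_dist / dist          # quieter the farther away it is
    azimuth = atan2(dx, dy)          # 0 = straight ahead of the listener
    pan = sin(azimuth)               # -1 = hard left .. +1 = hard right
    left = atten * sqrt((1 - pan) / 2)
    right = atten * sqrt((1 + pan) / 2)
    return left, right
```

In the bat example above, this is why a bat passing overhead and one flying beside you sound different: the per-ear gains (and, in real systems, per-ear delays and filtering) change with the source’s position relative to your head.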
Funded via Kickstarter, Altergaze merges the concept of “crowd manufacturing” with AR/VR. The result is a 3D-printed, smartphone-based VR headset that offers an immersive 110-degree field of view (FOV) experience – and it comes in a compact and wireless package.
Using 3D printing technology to create a product offers a high level of customization. At the moment, Altergaze boasts over 8.4 million unique variations depending on the design model, smartphone size, and color combination. The visor looks vaguely like the goggles worn by the minions in the popular animated film Despicable Me.
The headset is compatible with any smartphone, regardless of platform and display size. Moreover, it uses a device that almost everyone owns: a smartphone. Just slide it in the Altergaze headset, and you’re good to go.
Catch a glimpse of these three promising startups at Startup Festival 2017. At the event grounds, startups and VCs will have the opportunity to network and hold one-on-one consultations. There will also be an On-Air Zone where startups can gain media exposure.