Social Marketing vs. Social Engagement

How do you pick a voice out of all the chatter and attend to what that person is saying, especially when the conversations around you are louder than the one you’re having? This is called the cocktail party problem. Yet the brain is able to filter out the ambient sound and tune in to the voices we’re interested in. In scientific terms this has been described as “selective cortical representation of attended speaker in multi-talker speech perception.” However you describe it, and although we give it little thought, it’s a remarkable and critical ability we use every day.

Social Marketing

I often hear this question, in one form or another: “How can I, or my Web site, be heard in the worldwide crowd?” It’s easy to view the Web as a cacophony of voices in which it is impossible to be heard over all the noise. That’s a reasonable conclusion, but why would you want to be heard over all the voices? When you enter a crowded room, are you inclined to yell in order to get everyone’s attention? It doesn’t make sense, and yet some consider this to be the aim of social marketing. It’s an unattainable goal. A better approach comes from the cocktail party problem. Rather than trying to get everyone’s attention, enter the room, look for someone to talk to, ignore the unwanted noise, and pay attention to what the other person is saying. This is exactly how we all “work the room” in real life. Also, don’t forget, the most interesting people are very good listeners. The Bible puts it another way: “Be quick to listen, slow to speak” (James 1:19). This is excellent advice and a principle that works online as well. Rather than clamoring for attention, if you listen carefully to online conversations, you can filter out posts about your brand, your products, customer issues, trends, and so on, allowing you to engage meaningfully with customers, bringing them real value and bringing your business real money (to put it bluntly). More on social listening in a moment. First, let’s talk about social engagement.

Social Engagement

Social engagement is far more effective than social marketing, so let’s take a moment to examine social marketing first. Marketing implies getting your name out there, brand recognition, top-of-mind awareness, and a million other buzzwords. Abbreviations like SEO, for search engine optimization, and SEM, for search engine marketing, are the catchphrases of social marketers and, unfortunately, the esoteric language of online “snake oil salesmen.” SEO and SEM are passive marketing approaches that reek of old-world thinking. Consider it mathematically. Let’s say there are ten search results on the first page of Google results and thirty car dealerships in your town, and they all hire the best SEO and SEM marketers in the world. Can they all achieve first-page ranking? For the sake of argument, let’s say we lived in a world of quantum mechanics and they did manage to squeeze thirty dealerships into ten results; would this translate into clicks? Would it automatically translate into real business? Show me the money!

Sometimes very successful brick-and-mortar businesses appear to lose their minds when they open an online storefront. To illustrate, let’s create a hypothetical retail store called Fred’s Fun Stuff. Fred spent years building a business, building relationships with customers, and carving out a niche, and as a result he enjoys increased profits year over year. He knows he needs to go online, so he launches an online store. He buys all the hoopla about social marketing and spends out the wazoo for good search engine placement. It works: he gets good site traffic, but he isn’t selling much online. He learns that nearly everything in his store can be bought on Amazon for less, so he starts cutting his prices to compete with Amazon. His existing customers discover they can buy from him more cheaply online. Now he’s moving the same volume of product, has less foot traffic in his stores, and, thanks to smaller margins, makes less profit. Well done, Fred! The problem? Fred lost his mind.

In his brick-and-mortar store Fred doesn’t try to compete on price with the likes of Walmart. Why? He knows he can’t. So what does he do? Fred listens to his customers, engages with them, supports them, cares about them, and he builds relationships. “I go to Fred’s because of the personal attention.” “He has the coolest store and lets me try stuff out. If I don’t like it I can bring it back.” “Fred always stands behind his products.” You get the point. In a nutshell Fred does stuff Walmart can’t and won’t do. These are called differentiators and it is one reason some retailers stay around when others go under. There are many differentiators that make your “real world” business successful. The key is to bring those same factors to your online venture. Success tends to beget success.

Social Listening

In the cocktail party problem it’s all about attention and how that attention can change your brain. The same is true for social listening. A system has to be put in place to serve as your ears to listen in on all the conversations on Facebook, Twitter, Instagram, LinkedIn, Google+, and so on. Of course listening requires more than just ears. The system must be able to filter conversations by triggers like products, competitors, geographic location, key phrases, and then be able to bring these conversations to your attention. A dashboard shows you the conversations you want to “hear” in real time. Now you can engage. This is active, not passive, marketing.

To illustrate, let’s say you’re listening to social networks for any mention of a competitor and you find several customers complaining because a product is out of stock. You have the product in stock, so you engage those customers and let them know about the availability. Not only have you made a sale but, just as important, you’ve begun a new relationship. Relationship marketing, built on those differentiators, kicks in and focuses on customer retention.

I don’t mean to give you the impression there isn’t a place for SEO and SEM. These are important components for the same reason you may list your number and put a sign above your brick-and-mortar business. There is a place for passive marketing as there is a place for advertising. Online success isn’t about achieving good search engine ranking only to wait for someone to click on your site in hopes of a sale. Online success is about pursuing relationships. To put it in the framework of the cocktail party problem: You enter the worldwide social networking mixer, you have on your SEO/SEM name tag, and you listen to the conversations to identify that next opportunity. You work the room, you engage, you follow up, and you build relationships.

Azure Synapse Analytics: Powering data exploration and data warehousing with new features

This article is contributed. See the original author and article here: https://techcommunity.microsoft.com/t5/azure-synapse-analytics/azure-synapse-analytics-powering-data-exploration-and-data/ba-p/1695416.

Azure Synapse Analytics brings the worlds of data integration, big data, and enterprise data warehousing together into a single service for end-to-end analytics, at cloud scale. This week at Microsoft Ignite we announced several features that bring accelerated time to insight via new built-in capabilities for both data exploration and data warehousing.

 

As we dive into each new feature, we will use the terminology below to identify where the feature is applicable. For the SQL capabilities in Azure Synapse Analytics, the main resource used is called a SQL pool. This resource has two consumption models: serverless and dedicated. The serverless model provides transparent compute consumption and is billed by the amount of data processed. The dedicated model allows use of dedicated compute, comes with a capacity model, and is billed per DWU consumed. This new terminology will appear in the product soon.

 


 

Accelerate time to insight with:

 

  • Power BI performance accelerator for Azure Synapse Analytics (private preview)

Last year when we announced Azure Synapse Analytics, we promised to bring Microsoft’s data and BI capabilities together to deliver optimized experiences for our users. Today, we continue expanding on that promise with the announcement of the Power BI performance accelerator for Azure Synapse Analytics, a new self-managed process that enables automatic performance tuning for workloads and queries run in Power BI.

 

As Power BI users run their queries and reports, the performance accelerator monitors those queries behind the scenes and optimizes their execution, significantly improving query response times over the latest data. It analyzes all Power BI queries holistically and intelligently creates materialized views within the SQL engine by recognizing common query join and aggregation patterns. As Power BI queries continue to execute, they are automatically sped up and users see improved query performance, leading to quicker business insights. As new data is ingested into SQL tables, the materialized views are automatically refreshed and maintained. Best of all, as more and more queries are executed, the performance accelerator adjusts the deployed materialized views to fine-tune their design, all while reducing query execution times.
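
To give a sense of what the accelerator does behind the scenes, here is a minimal, hypothetical example of the kind of materialized view it might generate for a common Power BI aggregation; the table and column names are illustrative, not taken from the announcement:

```sql
-- Hypothetical materialized view of the kind the accelerator could create
-- for a Power BI report that aggregates sales by region (dedicated SQL pool).
CREATE MATERIALIZED VIEW dbo.mv_SalesByRegion
WITH (DISTRIBUTION = HASH(RegionKey))
AS
SELECT
    RegionKey,
    COUNT_BIG(*)     AS OrderCount,   -- COUNT_BIG(*) is required in Synapse materialized views
    SUM(SalesAmount) AS TotalSales
FROM dbo.FactSales
GROUP BY RegionKey;
```

Matching aggregation queries are then transparently rewritten against the view instead of scanning the base fact table, which is where the reduced response times come from.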

 

This feature can be enabled with a few clicks within Synapse Studio. Simply choose how frequently the process runs and set the maximum storage available for the system-generated materialized views, and it’s ready to start optimizing your Power BI workload.

 


 

The Power BI performance accelerator for Azure Synapse Analytics delivers a zero-management experience. It helps system administrators manage materialized views while allowing Power BI users to gain quick and up-to-date business insights.

 

This feature applies to the dedicated model. To participate, submit your request here.

 

  • Azure Synapse Link for Azure Cosmos DB now includes Synapse SQL (public preview)

Azure Synapse Link connects operational data stores with high performance analytics engines in Azure Synapse Analytics. Using Synapse Link, customers can perform near real-time analytics directly over their data managed in Azure Cosmos DB without impacting the performance of their operational workloads.

 

Today, we are announcing the public preview of Azure Synapse Link for Azure Cosmos DB using Synapse SQL. This functionality is now available to all customers and is deployed worldwide. Customers can now use a serverless SQL pool in Azure Synapse Analytics to perform interactive analytics over Azure Cosmos DB data enabling quick insights and exploratory analysis without the need to employ complex data movement steps. Thanks to the rich T-SQL support for analytical queries and automatic schema discovery for data, it has never been easier to explore operational data by running ad-hoc and advanced analytical queries. Best of all, due to the rich and out-of-the-box ecosystem support, tools such as Power BI – and others – are just a few clicks away. 
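
For illustration, a query like the following is all it takes to explore the Azure Cosmos DB analytical store from a serverless SQL pool; the account, database, key, and container names are placeholders:

```sql
-- Query the Azure Cosmos DB analytical store directly from a serverless SQL pool.
SELECT TOP 100 *
FROM OPENROWSET(
    'CosmosDB',
    'Account=contoso-cosmos;Database=RetailDemo;Key=<account-key>',
    Orders                -- the Cosmos DB container to query
) AS orders;
```

Thanks to automatic schema discovery, no column definitions are needed for exploratory queries like this; a WITH clause can still be added when explicit column names and types are desired.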

 


 

This feature applies to the serverless model. To learn more, visit the Azure Synapse Link for Azure Cosmos DB documentation.

Note: this functionality will become available in the next few weeks.

 

  • Enhanced support for analyzing text delimited files (public preview)

Despite the availability and popularity of columnar file formats optimized for analytics, such as Parquet and ORC, most newly generated and legacy data is still in delimited text formats. With this in mind, we are continuously improving the experience for delimited text data. To support immediate and interactive data exploration for this text data, the following enhancements are being introduced (a short T-SQL sketch follows the list):

 

– Fast parser: The new delimited text parser (CSV version 2.0) provides a significant performance improvement, ranging from 2X for smaller files up to 10X or more for larger files. This improvement, based on novel parsing techniques and multi-threading, is available to all existing and newly provisioned Azure Synapse workspaces.

– Automatic schema discovery: With automatic schema discovery, the OPENROWSET function can be used with CSV files without the need to define the expected schema. Because the system automatically derives the schema from the data being queried, users can focus on the insights they need, leading to faster and easier data exploration.

– Transform as CSV: We have extended support for the CREATE EXTERNAL TABLE AS SELECT statement to enable storing query results in the delimited text format. This functionality enables multi-stage data transformation to be performed while keeping the data in delimited text format throughout its lifecycle. 
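
The sketch below ties the three enhancements together; the storage account, container, and object names are placeholders rather than values from the announcement:

```sql
-- 1) Fast parser (PARSER_VERSION = '2.0') plus automatic schema discovery:
--    no column list is required; names and types are inferred from the files.
SELECT TOP 100 *
FROM OPENROWSET(
    BULK 'https://contosolake.dfs.core.windows.net/raw/sales/*.csv',
    FORMAT = 'CSV',
    PARSER_VERSION = '2.0',
    HEADER_ROW = TRUE
) AS sales;

-- 2) Transform and persist the result as delimited text with CETAS.
CREATE EXTERNAL DATA SOURCE contoso_curated
WITH (LOCATION = 'https://contosolake.dfs.core.windows.net/curated');

CREATE EXTERNAL FILE FORMAT csv_file_format
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',', STRING_DELIMITER = '"')
);

CREATE EXTERNAL TABLE curated_sales
WITH (
    LOCATION = 'sales/',
    DATA_SOURCE = contoso_curated,
    FILE_FORMAT = csv_file_format
)
AS
SELECT *
FROM OPENROWSET(
    BULK 'https://contosolake.dfs.core.windows.net/raw/sales/*.csv',
    FORMAT = 'CSV', PARSER_VERSION = '2.0', HEADER_ROW = TRUE
) AS sales;
```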

 


 

This feature applies to the serverless model. To learn more, visit the Azure Synapse SQL documentation.

 

Improve data loading performance and ease of use with:

 

  • COPY command (Generally Available)

Loading data into your data warehouse may not always be the easiest task. Defining the proper table structure to host your data, data quality problems, handling incorrect data and errors, and ingestion performance are among the typical issues customers face. We designed the COPY command to tackle these problems, and it has become the default utility for loading data into data warehouses within Azure Synapse Analytics. In addition to bringing the COPY command to General Availability, we have also added the following features (a brief example follows the list):

 

– Automatic schema discovery: Defining and mapping source data into target tables is cumbersome, especially when tables contain large numbers of columns. To help with this, we are introducing built-in auto-schema discovery and an auto-table creation process (the auto_create_table option, in preview within COPY). When used, the system automatically creates the target table based on the schema of the Parquet files.

– Complex data type support: The COPY command now supports loading complex data types stored in Parquet files, which eliminates the previous need to manage multiple compute engines. When used together with the automatic schema discovery option, complex data types are automatically mapped to nvarchar columns.
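
As a rough illustration, a single COPY statement can now load Parquet files and create the target table in one step; the storage path and table name below are hypothetical:

```sql
-- Load Parquet files and let COPY create the target table automatically
-- (auto_create_table is the preview option described above).
COPY INTO dbo.Sales
FROM 'https://contosolake.dfs.core.windows.net/raw/sales/*.parquet'
WITH (
    FILE_TYPE = 'PARQUET',
    AUTO_CREATE_TABLE = 'ON',
    CREDENTIAL = (IDENTITY = 'Managed Identity')
);
```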

 

These new capabilities are supported in partner products as well. Azure Stream Analytics, Azure Databricks, Informatica, Matillion, Fivetran, and Talend are among the products and services that support the new COPY command.

 

This feature applies to the dedicated model. To learn more, visit the COPY documentation.

 

  • Fast streaming ingestion (Generally Available)

With the rise of IoT devices, both the amount and velocity of the data produced has increased dramatically. To make that data available for analysis and to reduce the time it takes to load and query this data within your data warehouse environments, we are announcing the General Availability of high throughput streaming data ingestion (and inline analytics) to dedicated SQL pools in Azure Synapse using Azure Stream Analytics. This new connector can handle ingestion rates exceeding 200MB/sec while ensuring very low latencies.

 

With Azure Stream Analytics, in addition to high-throughput ingress, customers can use SQL to run in-line analytics such as JOINs, temporal aggregations, filtering, real-time inferencing with pre-trained ML models, pattern matching, geospatial analytics, and much more. It supports common formats such as JSON, as well as custom deserialization capabilities to ingest and analyze custom or binary streaming data formats. More details can be found in the announcement blog.
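
As a sketch, a Stream Analytics job might aggregate device telemetry before landing it in a dedicated SQL pool table; the input alias iothub and the Synapse output alias synapsepool are hypothetical job configuration names:

```sql
-- Stream Analytics query: 60-second tumbling-window aggregation per device,
-- written to an Azure Synapse dedicated SQL pool output.
SELECT
    deviceId,
    AVG(temperature)   AS avgTemperature,
    MAX(temperature)   AS maxTemperature,
    System.Timestamp() AS windowEnd
INTO synapsepool
FROM iothub TIMESTAMP BY eventTime
GROUP BY deviceId, TumblingWindow(second, 60)
```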

 

This feature applies to the dedicated model. To learn more about high throughput streaming ingestion, visit our documentation.

 

Secure your sensitive data using:

 

  • Column-level Encryption (public preview)

As data gets moved to the cloud, securing your data assets is critical to building trust with your customers and partners. Azure Synapse Analytics already provides a breadth of options that can be used to handle sensitive data in a secure manner. We are expanding that support with the introduction of Column Level Encryption.

 

Column-level encryption (CLE) helps you implement fine-grained protection of sensitive data within a table (server-side encryption). With CLE, customers gain the ability to use different protection keys for different columns in a table, with each key having its own access permissions. The data in CLE-enforced columns is encrypted on disk, and remains encrypted in memory, until the DECRYPTBYKEY function is used to decrypt it. Azure Synapse Analytics supports using both symmetric and asymmetric keys.
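
Here is a minimal sketch of server-side column-level encryption using a password-protected symmetric key; the table, key, and column names are illustrative:

```sql
-- Create a symmetric key and encrypt a sensitive column server-side.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong master key password>';
CREATE SYMMETRIC KEY SsnKey
    WITH ALGORITHM = AES_256
    ENCRYPTION BY PASSWORD = '<strong key password>';

-- Encrypt while writing (SsnEncrypted is a varbinary column).
OPEN SYMMETRIC KEY SsnKey DECRYPTION BY PASSWORD = '<strong key password>';
UPDATE dbo.Customers
SET SsnEncrypted = ENCRYPTBYKEY(KEY_GUID('SsnKey'), Ssn);

-- Decrypt only when explicitly requested.
SELECT CustomerId,
       CONVERT(varchar(11), DECRYPTBYKEY(SsnEncrypted)) AS Ssn
FROM dbo.Customers;
CLOSE SYMMETRIC KEY SsnKey;
```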

 

This feature applies to the dedicated model. To learn more, visit the Column Level Encryption documentation.

 

Improve productivity with expanded T-SQL support:

 

  • MERGE support (public preview)

During data loading processes, there is often a need to transform, prepare, and consolidate data from disparate data sources into a target table. Depending on the desired table state, rows need to be either inserted, updated, or deleted. Previously, this could be implemented with the supported T-SQL dialect, but it required multiple queries, which was costly and error prone. With the new MERGE support, Azure Synapse Analytics now addresses this need: users can synchronize two tables in a single statement, streamlining data processing while improving code readability and debugging.
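
A typical pattern, sketched here with hypothetical staging and dimension tables, looks like this:

```sql
-- Synchronize a dimension table with freshly loaded staging data in one statement.
MERGE INTO dbo.DimCustomer AS target
USING stg.Customer AS source
    ON target.CustomerKey = source.CustomerKey
WHEN MATCHED THEN
    UPDATE SET target.Email = source.Email,
               target.City  = source.City
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerKey, Email, City)
    VALUES (source.CustomerKey, source.Email, source.City)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;
```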

 

This feature applies to the dedicated model. For more details, see our MERGE documentation.

Note: this functionality will become available in the next few weeks.

 

  • Stored procedures support (public preview)

Stored procedures have long been a popular method for encapsulating data processing logic and storing it in a database. To enable customers to operationalize their SQL transformation logic over the data residing in their data lakes, we have added stored procedure support to the serverless model. These data transformation steps can easily be embedded when doing data ingestion with Azure Synapse and other tools, for repeatable and reliable execution.
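
For example, a procedure in a serverless SQL pool can wrap an OPENROWSET query over the lake so that callers never deal with file paths or formats; the storage URL below is a placeholder:

```sql
-- Encapsulate a data lake query behind a stored procedure in the serverless SQL pool.
CREATE PROCEDURE dbo.usp_TopSellingProducts
AS
BEGIN
    SELECT TOP 10 ProductId, SUM(Quantity) AS UnitsSold
    FROM OPENROWSET(
        BULK 'https://contosolake.dfs.core.windows.net/raw/orders/*.parquet',
        FORMAT = 'PARQUET'
    ) AS orders
    GROUP BY ProductId
    ORDER BY UnitsSold DESC;
END;
GO

EXEC dbo.usp_TopSellingProducts;
```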

 

This feature applies to the serverless model.

Note: this functionality will become available in the next few weeks.

 

  • Inline Table-Valued Functions (public preview)

Views have long been the go-to method for returning queryable table results in T-SQL. However, views do not provide the ability to parameterize their definitions. While user-defined functions (UDFs) offer the power to customize results based on arguments, only those that return scalar values had been available in Synapse SQL. With the addition of inline table-valued functions (TVFs), users can now return a table result set based on specified parameters. Query the results just as you would any table, and alter the function’s definition just as you would a scalar-valued function.
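
A brief sketch with a hypothetical sales table shows the idea:

```sql
-- Inline table-valued function: effectively a parameterized, queryable view.
CREATE FUNCTION dbo.SalesByRegion (@Region varchar(50))
RETURNS TABLE
AS
RETURN
(
    SELECT OrderId, OrderDate, Amount
    FROM dbo.Sales
    WHERE Region = @Region
);
GO

-- Query it like a table, passing the parameter at call time.
SELECT *
FROM dbo.SalesByRegion('West')
WHERE Amount > 1000;
```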

 

This feature applies to both the serverless and dedicated models. For more details, visit the CREATE FUNCTION documentation.

Note: this functionality will become available in the next few weeks, post deployment.

 

 

Try Azure Synapse Analytics today

Public Preview: A managed JBoss EAP experience on Azure

This article is contributed. See the original author and article here: https://techcommunity.microsoft.com/t5/apps-on-azure/public-preview-a-managed-jboss-eap-experience-on-azure/ba-p/1700210.

For several years, Red Hat and Microsoft have partnered to create cloud solutions that enable enterprises to achieve more. Azure Red Hat OpenShift, released in 2019, brought a jointly managed, enterprise-grade Kubernetes solution to Azure. Since 2016, Azure has offered Red Hat Enterprise Linux (RHEL) on virtual machines.

 

You can now run JBoss EAP on Azure App Service. For readers who are not familiar, Azure App Service is a managed hosting service for web and API applications, providing features for auto-scaling, networking, authorization, and more. With App Service, you can deploy WAR and EAR applications using App Service’s deployment APIs or CI/CD integration. Once your apps are deployed, set up auto-scaling to handle periods of higher load.

 


 

 

Whether your organization is running a heavily customized, clustered JBoss EAP deployment or has embraced container technologies, Azure has a cloud service to fit your needs. With RHEL on Azure Virtual Machine Scale Sets (VMSS), you can easily lift-and-shift your on-prem JBoss EAP deployments. Azure Red Hat OpenShift combines the innovation of enterprise Kubernetes with the world’s leading enterprise Linux platform, Red Hat Enterprise Linux. App Service now gives Jakarta EE developers the option to leverage a managed Platform-as-a-Service (PaaS) for their cloud migrations.

 

The JBoss EAP on Azure App Service experience has been jointly developed and supported by Red Hat and Azure. Once JBoss EAP on Azure App Service becomes generally available, any support cases concerning the JBoss server will be handled by the experts at Red Hat. Any cases concerning the App Service platform will be resolved by Azure support. In other words, your cases will be handled by the leading experts.

 

During the public preview, JBoss EAP 7.2 is offered on Red Hat Enterprise Linux 7 using OpenJDK 8. A version of JBoss EAP on Java 11 will be available later during the public preview. JBoss EAP on Azure App Service is offered at current App Service prices. As a preview release, there is no commercial support offering, and use of the preview is limited to development and testing of your applications. The General Availability release of JBoss EAP on App Service will include a service fee for the integrated technical support. If you create a JBoss EAP instance on App Service, you will receive an email notice prior to the GA release with more details on the pricing changes.

 

Get started today – try JBoss EAP on Azure App Service.

 


What’s new: The new Azure Sentinel Notebooks experience is now in public preview!

This article is contributed. See the original author and article here: https://techcommunity.microsoft.com/t5/azure-sentinel/what-s-new-the-new-azure-sentinel-notebooks-experience-is-now-in/ba-p/1695235.

We are happy to announce the public preview for the new and revamped customizable Jupyter notebook experience running on the Azure Machine Learning (AML) platform for analyzing your security data, all within a secure Azure cloud environment!

 

The new user experience provides an updated interactive UI with IntelliSense for improved productivity, support for the existing Jupyter and JupyterLab experiences, dedicated notebook compute, as well as point-in-time notebook snapshots and a notebook file explorer for easy notebook collaboration. In addition, built-in security analytics via Jupyter notebook templates and the MSTICPy Python library help jumpstart your security analytics and operations.

 

Whether you are a seasoned security analyst with extensive Python and Jupyter experience, or just starting out, you can immediately start experiencing these benefits by adding Jupyter notebooks to your threat defender arsenal. 

 

We highly recommend you check out the Getting started with Azure Sentinel Notebooks video and the official documentation to get started.

 

New intuitive and approachable UI

The new UI experience is based on the open-source nteract project. This simple and intuitive UI focuses on simplicity and ease of use, with full IntelliSense and inline error highlighting directly in your notebooks, drag-and-drop cells, individual tabs for each notebook, inline toolbars, and less clutter. It also supports the Jupyter and JupyterLab experiences and delivers 10X faster Azure Sentinel notebook launch times.


 

Improved collaboration and versioning

Easily share notebooks and other artifacts with other security analysts across your team and organization. A new notebook file explorer lets you browse your notebooks and your team’s notebooks in one place, making it easier to collaborate. Revert changes or review prior data by using the new checkpoint feature to take point-in-time notebook snapshots.


 

Managed and flexible compute with additional security features

Pay only for the resources you consume, with fully managed, dedicated cloud-based compute for executing your notebook workloads. You also get terminal access to your notebook compute, the ability to install custom Jupyter kernels (such as PowerShell and C#), and Azure Resource Manager (ARM) templates for compute deployments (article). Additional security features such as RBAC and SSH policy options are available today, with VNET support coming in the fall.


 

Happy threat hunting and investigation!

 

 

 

Power Platform Developer certification is in beta now

This article is contributed. See the original author and article here: https://techcommunity.microsoft.com/t5/microsoft-learn-blog/power-platform-developer-certification-is-in-beta-now/ba-p/1469428.

Microsoft is launching the new Power Platform Developer role, focused on implementing Power Platform solutions, and we need beta participants.

 

Do you design, develop, secure, and troubleshoot Power Platform solutions? Do you implement components of a solution that include application enhancements, custom user experiences, system integrations, data conversions, custom process automation, and custom visualizations? Do you have strong applied knowledge of Power Platform services, including an in-depth understanding of their capabilities, boundaries, and constraints? If so, this beta exam is for you!

 

The new Microsoft Certified: Power Platform Developer Associate certification has one exam that is currently in beta: Exam PL-400: Microsoft Power Platform Developer.

 

To receive the 80% discount*, use code PL400Isin when prompted for payment.

This is NOT a private access code. You can use this code to register for and take the exam on or before 10/26/2020.

 

*The first 300 people who register can take these exams for an 80% discount! (Why beta exams are no longer free.) The seats are offered on a first-come, first-served basis. You must register for the exam on or before 10/26/2020. Take the exam as soon as possible, so we can leverage your comments, feedback, and exam data in our evaluation of the quality of the questions.

 

Preparing for Beta Exams

 

Taking a beta exam is your chance to have a voice in the questions we include on the exam when it goes live. The rescore process starts on the day that exams go live, and final scores are released approximately 10 days later. For updates on when the rescore is complete, follow me on Twitter (@libertymunson). For questions about the timing of beta exam scoring and live exam release, see the blog posts The Path from Beta Exam to Live Exam and More Tips About Beta Exams.

Remember, the number of spots is limited, so when they’re gone, they’re gone. You should also be aware that there are some countries where the beta code will not work (including Turkey, Pakistan, India, and China). You will not be able to take the beta exam in those countries.

Also keep in mind that these exams are in beta, which means that you will not be scored immediately. You will receive your final score and passing status after your exam is live.

 

Related announcements
Announcing three new Microsoft Certifications for Business Applications

Skill up and stand out, with new role-based training and certification!
New role-based certification and training is here, and we’re just getting started!
Catching up: continuing our journey with new role-based certifications and training