As an open source enthusiast and a .NET developer I’ve been watching Microsoft’s transformation happen, and it has been great to watch. You see, I’m an avid user of DotNetNuke, and if you know anything about DNN’s history you know that DNN was one of the earliest, if not the earliest, open source projects in the .NET ecosystem. From 2003 on, DNN has been a pioneer in the .NET open source world.
A lot has happened, and several trends have come and gone, in the Microsoft world since 2003. For an open source project built on Microsoft technology, being open source wasn’t always a popular conversation topic. Open source wasn’t “cool,” and negative perceptions about open source solutions were common.
Boy have times changed!
Microsoft is Serious About Open Source… and It’s Not Just Lip Service
One of my college football coaches always said “Your words don’t mean anything, but your actions mean everything.” Actions are a really good sign of what someone really believes.
Microsoft’s strategic shift to embracing and focusing on open source over the past few years has been such a refreshing transition to see, feel, and experience for me and my fellow DNN’ers because of the actions we are seeing.
If we look at the recent and strategic moves Microsoft has made it’s easy to see that Microsoft is indeed serious about open source. If you aren’t convinced that Microsoft is serious about open source or if you are not keeping up, let’s look at some of the actions Microsoft has taken related to open source. And these are just the ones I have observed… I’m sure there is even more evidence out there.
Why It’s a Great Time to Be a .NET Developer
There has never been a better time to be a .NET developer. Literally everything you need to get started building is online and free to use, and even better, it’s likely open source. Anybody, anywhere can download code, look at it, enhance it, modify it, and submit it back to the project if desired. If you can dream it, you can build it, and you may build an online team of users and contributors to assist you in the process. Microsoft is making it easy to build open source projects via the technologies and resources they are providing. They are removing roadblocks for developers and being 100% transparent.
Consider the following capabilities anybody, anywhere has...
I referenced one of my college football coaches earlier, but he wasn’t the only one to impart wisdom during my athletic days. My high school coaches had more one-liners than anyone could remember. One line that stuck with me was “If you do the little things, the big things will take care of themselves.” Microsoft is not only doing the big things, but they are also doing the little things that continue to reinforce their seriousness about open source.
We are watching a culture and paradigm shift occur in real-time and it’s awesome. By going “all in” on open source Microsoft is not only winning the hearts of developers, but they are making it easy for developers to get started with their technologies! I think the strategic decision to embrace open source will have a big impact for Microsoft in the long term.
Last week I attended Microsoft’s Build Conference in Seattle. I was helping at the .NET Open Source booth which promoted the .NET Foundation and all things open source. The conference was very nice, and the energy level was high. I had conversations with a wide variety of people during the conference and it is obvious that Microsoft’s strategy of embracing open source is welcomed by developers.
During one of my discussions a gentleman told me that his organization uses open source software (OSS) and he wants to allow his developers to contribute to OSS, but he needed to be able to justify it to his corporate leadership. His organization is a large, global organization so he needed solid and clear reasoning for why contributing to OSS is something his company should support.
He asked me if I knew of any blogs or resources that could provide insight into this topic. I thought about it and while I’m sure there is info somewhere, I wasn’t aware of any specific blogs or content about this subject. I am obviously biased about this topic, but let’s consider some reasons why a business should support OSS… especially if their organization is using OSS-based products.
Before we list out reasons we should first define what “support” means. When business people hear the term “support” they generally think about money, cost, or financial implications. In the open source world, though, it’s not necessarily about money, as support can come in many different forms. Of course, the obvious need for any OSS project is code contribution, but there are more ways to contribute than one may initially think. As examples outside of the code, organizations could allow their developers to assist in marketing and promotion of sub-projects, conferences, user groups, GitHub repos, project documentation, etc. Developers could also volunteer in any area of the OSS project, as well as exchange knowledge online via forums, blogs, StackOverflow, and other channels. Organizations could also open up their offices for user group meetings, donate swag & door prizes, or sponsor the food at meetings. Any step taken to help move the OSS project forward is a form of support.
Now that we know that support can come in forms outside of financial contributions let’s get back to the subject. If you are faced with the need to justify supporting open source software to your business leadership here are some thoughts and ideas to consider:
In this blog I’ve summarized my thoughts around why it’s important for organizations to give back, be active in, and support OSS projects and communities. As one considers justifying OSS participation to the business side of an organization much of the conversation will center around educating the business-side on how OSS ecosystems function. Communicating the potential positive benefits will be what’s needed to help bring on a change in perspective or cultural shift within the organization.
In my mind there are only positives to gain from contributing to OSS projects. Your developers will learn more, be empowered, meet new developers of all ages and skillsets, and your organization will be more efficient, and will likely be viewed as a great organization to work for.
If you don’t want to jump in head first, then just try this one small thing to get your feet wet: if your developers have “down time,” simply encourage them to focus their energy and time on assisting with the OSS project in any area they choose, and watch what happens to your company in the months ahead. Be sure to pay attention to job satisfaction levels, quality of incoming new hires, general passion for work, and the perception of your organization among developers in your space.
After all, have you noticed that the OSS projects that thrive are the ones with active community support? Who doesn’t want the project they use to thrive? From my perspective, the benefits of contributing to open source software far outweigh the costs.
Last week I attended Microsoft’s Build Conference in Seattle. It was my first time attending, so I was excited and didn’t know what to expect. It didn’t take long to realize that Microsoft puts on a top-notch event. From DJs playing music in the waiting line, to the constantly available live-stream piped everywhere throughout the event, to the “cuddle corner” where attendees could pet animals and relax, to the awesome expo, and the non-stop new features and functionality being rolled out, one could easily be impressed.
It was indeed a great event and I’d like to share a few things I learned from having attended the conference. These items will be more high-level and conceptual things I noticed versus down-in-the-weeds technology specific items.
As attendees listened to live-streamed sessions and keynotes, the word “Azure” was prevalent throughout. I spent a lot of time in the expo hall of the event and I must have walked around it 6 or 7 times looking at booths and talking with Microsoft staff. Each booth had a navy-blue sign with white letters at the top indicating the technology being demonstrated at the booth. It was very eye-opening how many of the booths started with the word “Azure.” Sure, Azure has been out for a while now and that’s nothing new. I’m just communicating that walking around the expo and listening to sessions and keynotes, it was crystal clear that Azure is a major component of many Microsoft technologies.
Take Home Point: If you are reading this and are hesitant to embrace Azure, you should re-think your position, or you’ll soon be left in the dust.
At the event Microsoft released “Sphere,” a solution for creating highly secure connected microcontrollers (IOT devices). And you guessed it… they connect to Azure! Several of the most highly attended sessions and one of the most highly trafficked booths centered around Sphere.
As an IOT hobbyist I spent some time at the Sphere booth asking all kinds of questions about the Sphere development kits. I think the devices will be high-powered and offer a lot of functionality, but right now the price point seems high in comparison to competitor solutions, and the device being demo’d currently connects only to Wi-Fi. I imagine in the future they will connect to cellular via a SIM as well. One could argue that the increase in cost is the tradeoff for security, as Microsoft touted how secure these devices are.
While microcontroller devices were being demo’d, it’s important to note that Microsoft is not making the microcontrollers; rather, they are working with established vendors in the industry to do so. Microsoft is collaborating in the design of the devices and helping align them with their IOT strategy for the future.
Take Home Point: Microsoft is continuing to invest in IOT, is linking devices to Azure, and is promoting the security of their IOT solution.
Another common thread throughout the event was the pair of words “ML” and “AI.” You could hear them in several sessions and keynotes, and there were booths discussing and demoing these topics as well… and it makes sense. If your overall strategy is the cloud (Azure), and IOT devices are now easily connected to the cloud and sending tons of data to it, then what will you do with all that data? The answer: you will learn from it and use it to make better decisions and become predictive.
An example demo showed a DJI drone flying over the top of a building, analyzing HVAC pipes for inconsistencies or anomalies. Within seconds the drone was able to pick out the pipe that had the issue and show it to the audience in real time. One can easily see the benefits of equipping the drone with AI capabilities.
But it doesn’t stop there… Microsoft is making it easy for developers to tie into their ML and AI capabilities in Azure. If you’ve got data stored in Azure, chances are leveraging the ML/AI capabilities offered to you by Microsoft could help your organization.
Take Home Point: Don’t write “ML” and “AI” off as just buzzwords. If you are using Azure then you may be surprised at how ML and AI can already help you. Give it a look!
One thing that also stuck out to me at the conference was the energy and level of enthusiasm of attendees and exhibitors. And I’m not just talking about Microsoft fan-boys. There were a lot of non-Microsoft developers at the event which was interesting and served as proof to me that Microsoft is on the right track strategically speaking.
As of late Microsoft has been heavily promoting open source and being open in general. From the open-sourcing of many of their .NET technologies to embracing non-Microsoft technologies (think running Linux in Azure), Microsoft is earning the respect of developers. But this stuff doesn’t just happen by luck. Microsoft has taken a different strategic stance and it is paying off… you could easily “feel” it while at the conference.
Take Home Point: This ain’t your granddaddy’s Microsoft!
I’m glad I went to Microsoft Build. I’ve been to several conferences over the years (including South by Southwest) and Build was by far my favorite. Yes, the content was great, but the conference experience extended beyond the content of the sessions and was woven throughout all touch points of the event.
Everything was well-planned, organized, and first class from what I could tell. The registration process went smoothly, swag was everywhere, food and drinks were easily available, the “flow” of the expo was easy and open, the venue was great, several hotels were close by, the new technologies and content were awesome, and the Microsoftees were friendly. I didn’t see any attendees who had issues connecting to the internet, nor did I hear anyone complaining about the typical things you’d see at conferences. All the details seemed to have been handled.
After all, where else could you hear about the latest and greatest technologies, pet dogs/rabbits/miniature horses, get free massages, meet the leaders of Microsoft, and have bottomless refreshments and snacks all in the same room?
Take Home Point: If you haven’t been to Build, you should go. It’s a great event.
If you keep up with the blog or the SC Hog Removal page you know we’ve been getting calls from local farmers with hog problems. We’ve been staying after these hogs as it seems they can reproduce nearly as fast as we can get them off a farmer’s property. It’s a full-time job to keep them at bay and we are having fun with it.
Big & J Hog Attractants
We have been using the Big & J hog attractants “Hogs Hammer It” and “Pigs Dig It” in combination with corn and I can tell you that the hogs do like it! When they come in they stay until all the corn is gone and leave the place looking like a tractor had plowed through it. Here again leading up to this hunt we’d put out the corn and attractants and hoped things would line up.
Labor Day Weekend
I had to hang around for a day or so this Labor Day weekend, so I thought, why not see if the hogs were moving? It was also the first day of deer hunting season in my game zone, so I went deer hunting before dark, got some food afterwards, and then headed out for hogs.
As it was a holiday weekend some of my hunting partners were unable to go, but at the same time some of my friends were back at home for the holiday. I was able to talk Garth Knight into going hunting with me. I let him know the hogs had been acting oddly lately as far as their feeding schedule so I was not sure what would happen.
A Short Hog Hunt!
Garth and I set up overlooking a field that was not far from a swamp. We’d been getting hogs on camera at all hours of the night. Sometimes they would be solo and sometimes there’d be about 15 of them, so I didn’t know what to expect. We got there and got set up around 9:15 or so. I was telling Garth about all the lessons we’d learned with night vision technologies, guns, and the way the hogs had been acting lately.
Every few minutes I checked the bottom of the field looking for heat signatures. We’d been there about 45 minutes when I was telling Garth about how the scope can live-stream hunts to the phone. I got up to turn the Wi-Fi on and as I looked through the scope I saw some bright spots coming through the woods. I told him they were on the way! So we finished streaming the video to the phone and just watched as the hogs approached.
I wanted to give the hogs a few minutes to ensure there were no more coming because sometimes there would be large groups trailing the hogs. So we watched the hogs eating the corn for a few minutes. Nothing seemed to be coming behind these hogs so I decided it was time to take action. I asked Garth if he wanted to shoot and he said he’d hold off this time. It took me a little bit to pick out which hog was bigger and I flipped into black hot mode once to see if that would help. Finally, I was able to figure out the hog on the left was the bigger hog and I told Garth to get ready.
A few seconds later, thanks to the Anderson Rifles AM-10 308 Hunter + Pulsar Trail XP 50, the bigger hog was on the ground! Not bad for the first day of deer hunting season, right?
I’ve recently been researching Splunk and have been impressed with its power, flexibility, and ease of use. This blog is not intended to be a step-by-step tutorial, but rather is aimed to show some initial findings, overview one way to integrate Splunk with DNN, and paint the picture of some potential use cases.
So What is Splunk?
If you don’t already know what Splunk is, Splunk is a software company based in San Francisco that produces software for searching, monitoring, and analyzing machine generated big data via a web style interface. Splunk’s software helps organizations with operational intelligence, log management, application management, enterprise security and compliance.
Installing Splunk was simple and after clicking around a little while it was evident that Splunk is an intuitive software. From a UI standpoint, it makes logical sense and the flow is easy to understand. And it didn’t take long to see and understand how powerful it is.
As you may imagine, I began to wonder if and how I could integrate Splunk with DNN.
DNN + Splunk: One Way to Connect the Two
One of Splunk’s powerful features is that it can literally suck in all types, styles, and formats of data. This data can be machine data, log files, or even data from a REST API. There are several mechanisms for getting data into Splunk, but for this scenario, DNN’s web API implementation makes this an easy fit. On the DNN side, a developer can easily create a custom module using web services to expose any DNN data on an endpoint, which Splunk can then access. If you’d like to go the custom module route, check out my other blog series on module development. However, I did not write a custom module to test the integration.
For my initial investigation into Splunk I chose to use DNN Sharp’s API Endpoint module, as it allows easy configuration of endpoints. Splunk is architected to consume any type of data, and it then makes that data extremely easy to search and to create visualizations and/or alerts with. These searches, visualizations, and alerts can be very basic or very complex in nature.
Another thing to note is that Splunk is architected to do this at scale and can easily parse enormous amounts of data. For example, every time you drink from a Coca-Cola “Freestyle” machine at a fast food restaurant, the data from your drink selection is logged, and Splunk helps analyze the data, denote trends, and send alerts. So yes, those Coke machines (all across the world) are connected IOT devices, and Coke is a Splunk customer. See how Coke is using Splunk in the Splunk Conf 2014 keynote replay session. Imagine how much data that is on a global scale; Splunk is helping Coke make sense of it.
Side note: Check out the blog I wrote on using Particle & Splunk to monitor temperature
So, my first goal was simple: see if I could get data from DNN into Splunk.
Sticking with the “data logs” train of thought, I figured why not expose the DNN event log on an endpoint and see what I could make happen. Obviously, the event log may not be the best use case, since site administrators can clear logs and automated processes to clear logs sometimes exist. However, for this initial test it was a good candidate. To get the event log data onto an endpoint, I used the DNN Sharp API Endpoint module to run a SQL query against the event log view and return the results as JSON.
With the event log sitting out there as JSON on a DNN endpoint, all I needed to do was get it into Splunk…
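To make this concrete, here is a minimal Python sketch of pulling and parsing that JSON. The endpoint URL and field names are hypothetical; your actual endpoint and event log view will dictate the real shape.

```python
import json
from urllib.request import urlopen

def fetch_event_log(endpoint_url):
    """GET the DNN endpoint and parse the JSON body."""
    with urlopen(endpoint_url) as response:
        return json.loads(response.read().decode("utf-8"))

def parse_event_log(raw_json):
    """Parse an event-log JSON string into a list of event dicts."""
    return json.loads(raw_json)

# Sample payload shaped like what such an endpoint might return
# (field names are illustrative, not DNN's actual schema).
sample = """[
  {"LogTypeKey": "PAGE_NOT_FOUND_404", "LogCreateDate": "2018-05-14T21:03:00"},
  {"LogTypeKey": "LOGIN_FAILURE", "LogCreateDate": "2018-05-14T21:05:12"}
]"""

events = parse_event_log(sample)
print(len(events))              # 2
print(events[0]["LogTypeKey"])  # PAGE_NOT_FOUND_404
```

In the real setup Splunk does the fetching for you via the REST modular input, so this is only to illustrate what’s flowing across the wire.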
Getting REST Data Into Splunk
The Splunk side of this configuration only took a few minutes to set up, and keep in mind I’m no Splunk guru (read: it’s easy!). Splunk is similar to DNN in that it’s extensible. Splunk extensions can be found in the Apps and Add-Ons sections of the Splunk website. I tell you this because, ultimately, I followed a blog by Damien Dallimore on getting REST data into Splunk which used a modular input extension, and that was all it took. I simply completed the required fields in the Splunk REST Modular Input as shown below.
I chose to poll the data every 60 seconds. With this information entered, I clicked save, returned to the Data Inputs screen of Splunk, and chose my newly created data source.
BOOM! I was seeing DNN event log info in Splunk!
Searching, Visualizations, & Alerts in Splunk
With data in Splunk, I needed to proceed to using Splunk to make sense of it. Splunk’s search functionality makes it very easy to search for, well... anything you’d like. I’m not yet knowledgeable enough to fully explain all the capabilities, but what I can easily see is that you can select your data source, click on keywords, add them to the source’s search criteria, and set your desired timeframe for the search. It feels as if you have a Google search bar where all your searches are performed on your data source, and IntelliSense-style autocomplete & syntax highlighting for your search are provided too!
Once you have a search returning data you can then create visualizations or alerts. And yes, there are tons of visualizations provided by Splunk. These visualizations can be saved as reports or live as “panels” that reside on dashboards. Dashboards can have as many panels as you want and you can have multiple dashboards if you like. Also, you can easily embed these panels into DNN or any other location by clicking the “convert to HTML” link that each panel has. Being able to display this info anywhere you like is a neat feature. Are your mental light bulbs turning on yet?
So, I created a few visualizations based on the event log data that was available: a number-based visualization showing a large number that represented a count of 404 errors, a line graph showing the number of failed logins, and a chart showing the 404s over time. In just minutes Splunk was already helping me understand that I have some issues going on with one of my sites. I believe one reason for the 404s is that I’ve renamed some pages that bots are targeting trying to register. Anyways, I’ve got work to do... don’t judge!
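For readers who think better in code, the kind of aggregation those panels represent could be sketched in Python like this (the event records and type keys are made up for illustration):

```python
from collections import Counter

# Made-up event records shaped like rows from an event log.
events = [
    {"LogTypeKey": "PAGE_NOT_FOUND_404", "LogCreateDate": "2018-05-13"},
    {"LogTypeKey": "PAGE_NOT_FOUND_404", "LogCreateDate": "2018-05-14"},
    {"LogTypeKey": "LOGIN_FAILURE", "LogCreateDate": "2018-05-14"},
    {"LogTypeKey": "PAGE_NOT_FOUND_404", "LogCreateDate": "2018-05-14"},
]

# The single-number panel: total count of 404 events.
total_404s = sum(1 for e in events if e["LogTypeKey"] == "PAGE_NOT_FOUND_404")

# The chart panel: 404s bucketed by day.
per_day_404s = Counter(
    e["LogCreateDate"] for e in events if e["LogTypeKey"] == "PAGE_NOT_FOUND_404"
)

print(total_404s)                  # 3
print(per_day_404s["2018-05-14"])  # 2
```

The point of Splunk, of course, is that you express this as a search instead of code, and it runs against however much data you have.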
Opening Up Possibilities
Now you may be looking at this and thinking to yourself, yeah this is neat, but I could create a custom module to make something similar to this happen. And you would be correct, but keep in mind the potential use cases, flexibility, and scalability of Splunk in comparison to a custom module. You could easily have all your customers as data sources and create dashboards to help you (and your customers) quickly understand what’s going on with your customers’ applications. You could also do data mashups of data from a DNN website/web app, some IOT device out in space, and any other data source you can think of to provide valuable insight. And again, Splunk has no problem doing this with massive amounts of data.
With just a little research into Splunk it didn’t take long to get my mind spinning with all the possibilities within DNN and beyond. Think about your current DNN use cases, requirements of your customers, and the exploding IOT market and you’ll soon see the light.
Here are some ideas I had right off the bat:
As you can see, the power and flexibility Splunk provides is really nice. I believe Splunk could be a game-changer, especially for those with large amounts of data to parse, anybody in the IOT space, and much more. I hope this blog has provided you with an introductory glimpse into some of the capabilities of Splunk and has even gotten you thinking of potential ways to integrate Splunk into your applications or your customers’ environments. I am still learning about it and hope you will too. I know that I’m just scratching the surface here in my initial findings.
Find out more about Splunk at http://www.Splunk.com
In the past few years I’ve been getting into IOT. You may have seen tweets or blogs about the Tech-Turkey project I’ve been working on, or the flame-throwing pumpkins at Halloween. I’ve learned and used Arduinos, Raspberry Pis, and Particle Photons and Electrons. It has been fun to learn more and get into the connected world… the internet of things!
I’ve been keeping ServoCity in business and even recently worked to get a custom PCB created. Every step of the way I’ve been learning different things and realizing just how much more there is to learn. Recently I’ve started learning more about Splunk.
What is Splunk?
If you don’t already know what Splunk is, Splunk is a software company based in San Francisco that produces software for searching, monitoring, and analyzing machine generated big data via a web style interface. Splunk’s software helps organizations with operational intelligence, log management, application management, enterprise security and compliance.
Side note: In my first exploration into Splunk I wrote a blog about using Splunk with DNN that may interest you.
Particle & Splunk
DNN is a web application, but what if I wanted to get data from an IOT device? That’s when we call on Particle. If you’re not familiar with Particle, it makes it really easy to bring real-world objects online. Particle is one of my favorite IOT platforms. It makes awesome microcontrollers, provides a nice IDE, has awesome documentation, and a great community. Connecting to Particle’s cloud is straightforward and even southerners can do it! See my presentation at our user group on DNN & Particle.
If you’re not familiar with Splunk, it makes it really easy to pull in data (machine data) and make sense of it. I’m talking about parsing vast amounts of data, creating visualizations and/or alerts and making it simple to understand. Even southerners can use it too!
Both Particle and Splunk are industry leaders and have some really big names behind their companies and as clients of their companies.
So why not bring Particle & Splunk together?
Reading Temperature with Particle
To use the awesomeness that both solutions bring us, we’ll first need to read the temperature and post it to a web service. Here again, Particle makes this easy. I used a basic temperature reader in a breadboard layout for this experiment.
Then, in Particle’s IDE, I used the basic tutorial-level code to read an analog value and post it to a Particle cloud variable. Cloud variables are accessible via web services. That is, I can make a GET request and parse the JSON object to get the data. Epic.
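As a sketch of that GET request, here’s how reading a cloud variable might look in Python. The device ID and access token are placeholders, the URL follows Particle’s cloud-variable API pattern, and you should double-check the response fields against Particle’s docs for your firmware.

```python
import json
from urllib.request import urlopen

def read_cloud_variable(device_id, variable, token):
    """GET a Particle cloud variable and return its current value."""
    url = (
        "https://api.particle.io/v1/devices/"
        f"{device_id}/{variable}?access_token={token}"
    )
    with urlopen(url) as response:
        return json.loads(response.read().decode("utf-8"))["result"]

def parse_reading(raw_json):
    """Pull the 'result' value out of a cloud-variable response body."""
    return json.loads(raw_json)["result"]

# Sample response body shaped like Particle's; a raw analog value
# like this would still need converting to degrees for your sensor.
sample = '{"name": "temperature", "result": 2187}'
print(parse_reading(sample))  # 2187
```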
Now we were cooking with oil! The next step was to get this data into Splunk.
Getting Particle’s RESTful Data Into Splunk
Getting RESTful data into Splunk is really straightforward thanks to Splunk’s extensibility. Splunk has an extensions gallery that can be found in the Apps and Add-Ons sections of the Splunk website. I tell you this because, ultimately, I followed a blog by Damien Dallimore on getting REST data into Splunk which used a modular input extension, and that was all it took. I simply completed the required fields in the Splunk REST Modular Input as shown below.
After clicking save, the data from my Particle temperature reader was showing up in Splunk!
Creating Dashboards from the Particle Data
Once data shows up in Splunk you can literally perform any search query you want on the data and create/configure dashboards, panels, reports, alerts, and more. Splunk is very powerful in this regard and scales to infinity. However, for this scenario I just wanted to log the temperature over time from one device, as well as the temperature’s highest, lowest, and average values. Splunk, again, made this very simple.
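Outside of Splunk, that highest/lowest/average summary is just a simple reduction; a Python sketch of the same idea (with made-up readings) looks like this:

```python
def summarize(readings):
    """Return (highest, lowest, average) for a series of readings."""
    return max(readings), min(readings), sum(readings) / len(readings)

# Made-up temperature readings logged over time.
readings = [68.5, 70.2, 71.0, 69.4, 72.9]
highest, lowest, average = summarize(readings)
print(highest, lowest, round(average, 1))  # 72.9 68.5 70.4
```

Splunk’s stats command computes the equivalent (max, min, avg) in one line of a search, at whatever scale the data demands.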
After clicking on the “result” field I created some visualizations and voila! Out popped some neat dashboards showing all my data in a way that’s easy to understand.
If you are like me, you kind of want to see things in action. So for those of you like me who are visual learners, here’s a quick video of the solution in action.
An IOT Combination That’s Hard to Beat!
As you can see, both solutions are awesome and the opportunities are endless. Consider the possibilities here… Particle makes it easy to deploy devices and post data to the net, and Splunk can easily connect, suck in the data, and bring instant insights. The more data you give to Splunk, the more knowledge you’re going to have. Splunk can handle this at scale too… I mean massive scale. Why not connect thousands of devices, pump all the data into Splunk, and tune it to your liking! I believe that’s what they refer to as operational intelligence 😊 Now my mind is spinning with possibilities. Is yours?
The Central High School Eagles of Pageland, South Carolina have a rich tradition and history of success. A few years ago my friend, Jason Fararooei, a video producer from the Charlotte area, took a liking to the program. Over the years, Jason has made some really great videos for the Eagles. If you haven’t seen them then check out 3:17 and the Eagle Tribute Video.
With so much recent transition going on at Central, we decided to make another video to try and create energy and enthusiasm around the program. Our hope is that the new head coach, Trent Usher, will get the program back to where it used to be.
We had quite the eventful weekend last weekend. If you read the “Big & J Hogs Hammer It and Pigs Dig It Helps Get Rid of Nuisance South Carolina Hogs” blog that posted on Monday then you are aware of the local farmer who had reached out to us to assist with his hog problem. Although we expected multiple hogs to come out on the first hunt we only ended up seeing one.
So we returned for another hunt a day or so later…
The hogs had stayed away for a day, but on day 2 they wiped out all the remaining corn that was saturated with the Big & J hog attractant. The farmer notified us of what the hogs had done overnight, so we knew we needed to be back down at the farm sooner rather than later.
After replenishing the corn, I went down to the farm on a solo hunt as my hunting partners were unable to come on this specific night. The farmer sat with me and we watched the corn pile for a while and were ready to handle business. However, nothing moved just after dark. We sat and strategized what we would do when certain hogs arrived, but nothing was moving. The farmer had to pack it in for the night so I remained on the gun watching the field.
Shortly after the farmer left 3 deer came out and grazed through the field. I watched them for a while in the scope. Then 2 more deer entered the field. Interestingly, the deer did not eat the corn that had the Big & J hog attractant on it (which is a good sign to me!). Eventually the deer exited the field into some nearby woods.
From Reading a Devotional to Shooting a Hog
I was reading a devotional on the Bible app, and I would stop every couple of minutes and scan the field. I’ve hunted hogs enough to know that the hunt can change in an instant, because these hogs don’t hesitate much when they come into a field and they move more quickly than you might expect. I read and scanned, read and scanned, and towards the end of the devotional I noticed a blob of heat on the corn! While I was reading, a group of hogs, one female and several piglets, had gotten out into the middle of the field.
I knew it was game time.
I got in the gun and watched this group for a few minutes. I scanned the edges looking to see if any more were nearby or entering the field. I didn’t see any sign of other hogs coming in so I continued to watch. I knew I was going to shoot the big one, but it was just a waiting game.
I don’t like to shoot in the middle of a white blob of heat because it’s hard to tell exactly what you’re aiming at and sometimes the piglets are taller than you think. Translation: I didn’t want to get a piglet and miss the big one so I waited on the right opportunity to present itself. I needed the big hog to separate herself far enough so that I could get a silhouette of her body and know where I was aiming.
While I watched them feed something funny happened. One of the piglets went behind the female and the larger female cut the piglet a flip! She kicked the piglet and it somersaulted backwards and when it landed it just got right back up and kept rooting. It was pretty funny. I couldn’t believe what I’d witnessed.
A few seconds later the large female advanced forward aggressively, and this singled her out. It was just the sight I was waiting for. I flipped the safety off and squeezed the trigger really slowly. The Anderson Arms AM-10 308 that I have has a long trigger pull, and in hopes of not flinching on my shot I always try to ensure the gun surprises me when it goes off. I hope for the smooth trigger pull. I put the crosshairs on this hog’s shoulder and squeezed off.
The boom echoed through the field and down to the creek.
The large hog instantly fell and within a second the piglets scurried out of the field. Since the large hog was on the ground, my job shooting was essentially done. I waited a while and started loading up the truck.
Loading a Hog By Yourself Ain’t Easy
I took the shot at about 11:58 and with my hunting partners not around it was me… and well me… that had to load the hog up. When I got down to the hog I realized she was bigger than I thought. Getting her in the truck wouldn’t be as easy as it normally is when you have help.
Ultimately, I ended up dragging the hog to the side of the field and then walking up the bumper to the tailgate with one of the hog’s legs in my hand. When I got in the bed of the truck the hog was very heavy to hold on to, so I had to essentially lay down on my stomach and grab the other leg with my other hand. With both legs in hand I then had to figure out a way to stand up. It reminded me of a deadlift that we used to do in high school and college football, except this was more awkward and off balance. If you had seen me you would have laughed, but once I got my feet under me I was able to pull the hog into the truck using the tailgate as a lever. I hope that’s the last time I have to load a big hog up by myself!
And since there was no one there to take a pic of me and the hog I had to take a hog selfie!
It was a great hunt and yet another nuisance hog is in the freezer at the processor!
Do You Have Hog Problems?
If you have hog problems we’re happy to help. Learn more about how we are helping land owners and farmers with their hog problems on the SC Hog Removal page.
Another South Carolina Farmer With Nuisance Hog Problems
We’ve recently been in communication with another local farmer whose crops were being demolished by hogs. On this specific farmer’s land, the hogs showing up and rooting his crop fields was a new occurrence. Frustrated and not exactly sure how to solve this problem, the farmer asked us how quickly we could help him out. Within a day we had game cameras set up and were getting recon on the hogs’ pattern on this specific property.
Big & J Hog Products Help the Hunt
In this setup, the area where the hogs were showing up was fairly narrow. The field makes kind of a point where the hogs have easy access and had been rooting. This meant we most likely wouldn’t get multiple shots and would need to draw the hogs to the middle of this area of the field.
To coax the hogs into the middle of the field we used something that would be memorable for them, Big & J’s new Hog attractant products. We spread both Hogs-Hamer-It and Pigs-Dig-It on top of corn in the middle of this point in the field. And it didn’t take long before we had them coming in and loving what Big & J’s products had to offer!
Only One Hog Came Through
Due to the amount of damage we’d been seeing on this property we anticipated seeing several hogs, but on this hunt, it didn’t play out that way. The wind was not in our favor and was blowing pretty strong. We sat for a while and shot the breeze. Early in the night we had a deer that kept walking through the field and right around midnight we had a solo hog come in and go straight to the Big & J hog attractant marinated corn pile!
For us it’s rare to see a solo hog like this unless it’s a really big male. So we waited, thinking that more would eventually come out. And we waited and waited and waited. It seemed like forever, but it was probably around 10 minutes or so. Evidently the hog was there by itself. We decided to go ahead and pull the trigger; nothing else seemed to be showing up, and we didn’t want this one to get out of there before we could get a shot off.
As you can see on the video below, the Anderson Arms 308 with Pulsar Trail XP50 made quick work of this hog. The hog flopped on the spot and our tracking job was easy! We loaded her up, took some pics, and took her to the processor.
Another nuisance South Carolina hog headed to the freezer.
Have you been to a Central High Eagles football game lately? Central Eagles football has a history of success, championships, and hard-nosed football. Did you happen to notice the old and outdated sound system at the stadium? What we don’t have a history in… is an awesome sound system at the stadium! Of course, the sound system at the stadium was a great setup – 20 years ago – but technology has come a long way since then and it’s time to upgrade!
Change Desired: An Energized Environment
New head coach, Trent Usher, has stated that one of his desired changes is to create a new and energized environment on Friday nights. Obviously, this will depend on fans, the booster club, cheerleaders, the band, etc., but one way we can help create this environment is to put a new sound system in place. Yes, this is a big project to undertake, but it is one worth undertaking and would enhance the environment at the games and be enjoyed by everyone.
I’m posting this blog in hopes of helping get the word out about this project.
The Outdated System
Seriously, what year was the old system installed? It’s been a while. If you’ve been to a game lately then you know the sound system is not the greatest. If you’ve ever been inside the press box during a game then you may have seen how music is played… they literally play music on an iPhone and hold the microphone from the sound system up to the iPhone to get songs played over the loudspeakers! So yes, we have room to improve.
The New System
We’ve reached out to multiple vendors and have settled on working with Verge Multimedia for the new system. Verge has provided a quote that entails removal of the old sound system, installation of the new sound system, and training on the system. The system has several components, but has 3 primary large speakers which will be placed on the press box. You can see the visuals from the proposal included below.
The bid total is $12,500. This amount factors in a discount for 501(c)(3) tax-exempt organizations, since we are working through the Football Club, which is affiliated with the school.
Interested in more details? Download the full proposal.
How You Can Help
Yes, this will take money, $12,500 to be exact. We are reaching out to former students, athletes, alumni, and local businesses in hopes of gaining the needed financial support for the new system. (So help us spread the word on all your social channels!) The presence of the new system will reflect the support for the Eagles and be a sign of a community engaged with its local high school.
To contribute, you’ll need to write a check or give cash to the Touchdown Club. Contributions can be sent to the school or given to Courtney Usher. Please note that if you would like a receipt (for tax write-off purposes) we can provide you with one.
Want an official document on Central High School letterhead? Download the Fundraising Document
If you have questions or want more info on this just contact me on Twitter or here through my contact page or reach out to Courtney Usher on Facebook.
Help us take steps to restore the tradition of winning, success, and a great environment at Central!