Sage 100 ERP setup on a Windows Server machine - python

We have purchased a Sage 100 partner account, and I have set up Sage 100 ERP on Windows Server 2016. But I am stuck on the following points:
Where to add a business
How to set up web services and access REST APIs
How to configure the server
Any help with the Sage 100 setup will be appreciated.

Typically you would work with a Sage partner or reseller to set up your Sage 100 environment. Depending on your location, there should be several available. You would typically check the Sage website to see the Sage partners in your area.
With that said, I used to do a lot of programming against Sage 100, and I can tell you that there is no REST or web services API. What you would typically do is deploy your own API that reads from Sage 100 as a database. There is an ODBC connection included by default with the product, called SOTAMAS90, that gives you read-only access to all the Sage 100 tables. The 32-bit connector is installed automatically when you install the program. There is a 64-bit version as well, but that takes more work to set up. The 32-bit version is easiest, but it does require that your API code run as a 32-bit service or program.
I would typically write C# programs that consume the SOTAMAS90 data and serve it via REST. ASP.NET Web API or ASP.NET Core are both good choices for this.
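The same pattern works in Python (the question's tag). Here is a minimal sketch using pyodbc against the SOTAMAS90 DSN; the company code, credentials, and the AR_Customer table and column names are placeholders you would adjust to your own install, and the process must be 32-bit Python to match the 32-bit driver:

```python
def rows_to_dicts(cursor):
    """Turn an ODBC result set into JSON-friendly dicts for a REST response."""
    columns = [col[0] for col in cursor.description]
    return [dict(zip(columns, row)) for row in cursor.fetchall()]

def fetch_customers(company="ABC", user="sage_user", password="secret"):
    # Imported here so the pure helper above is usable without the driver.
    # Must run under 32-bit Python to match the 32-bit SOTAMAS90 driver.
    import pyodbc

    conn = pyodbc.connect(
        "DSN=SOTAMAS90;Company=%s;UID=%s;PWD=%s" % (company, user, password)
    )
    cursor = conn.cursor()
    cursor.execute("SELECT CustomerNo, CustomerName FROM AR_Customer")
    return rows_to_dicts(cursor)
```

A small Flask or FastAPI app can then return `fetch_customers()` as its JSON payload, which is exactly the "your own API in front of the database" approach described above.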
Since the SOTAMAS90 ODBC client is read-only, you will have to do something else if you need to write data back to Sage 100. The two interfaces that I'm familiar with are VI and BOI.
VI, or Visual Integrator, is basically a utility for importing data from a source file (typically a CSV). It has some limitations, but it does work, and you can launch it programmatically, which makes it usable on demand. It doesn't throw error messages, however: if a row can't be written, it just skips it. You can view a report after the fact to see what was written and what wasn't.
BOI, or the Business Object Interface, is a COM component that you can code against. It provides more robust data validation and does throw errors on a per-record (and sometimes per-field) basis, so you can respond to those in your code accordingly. Unfortunately, while most of the modules are exposed through the BOI, not all of them are. Every year Sage ports more and more functionality to "the new framework", which also makes it available via BOI.
Finally, you can also set up a linked server in SQL Server to serve the ODBC data that way. Any way you hit that SOTAMAS90 DSN, though, it's slow. Some developers like to copy all of the data to SQL Server and serve it from there. If you do that, be sure to add foreign keys and indexes, and run a nightly ETL to keep the data fresh. There are also solutions via User Defined Scripts that will allow you to respond to individual row CRUD events.
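That nightly ETL can be a short script. A sketch, assuming pyodbc-style cursors on both sides and a hypothetical `stage_` naming convention for the SQL Server copies:

```python
def copy_table_sql(table, columns):
    """Build the parameterized INSERT for the staging copy of one table."""
    placeholders = ", ".join("?" for _ in columns)
    return "INSERT INTO stage_%s (%s) VALUES (%s)" % (
        table, ", ".join(columns), placeholders)

def etl_table(sage_cur, sql_cur, table, columns):
    """Copy one Sage 100 table (read via SOTAMAS90) into SQL Server."""
    sql_cur.execute("DELETE FROM stage_%s" % table)  # full refresh each night
    sage_cur.execute("SELECT %s FROM %s" % (", ".join(columns), table))
    sql_cur.executemany(copy_table_sql(table, columns), sage_cur.fetchall())
```

Run one `etl_table` call per table on a scheduled task, then add the indexes and foreign keys on the staging side as suggested above.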
Hope that helps.
Aaron

Related

Export a MySQL database with contacts to a compatible CardDav system

I have a standard MySQL database with a table containing contacts (I'm adding contacts to the table through a web app built with Zend Framework), thus with my own fields.
Is it possible to create a server which would be compatible with the Address Book system of OS X? I think I must be compatible with the CardDAV system.
Has anyone already done that? If yes, how did you handle it? Did you create your own server? Is there a CardDAV library for Python, for example? I just want to be able to read my contacts using the Address Book of OS X.
Thanks a lot for your answers,
Best,
Jean
Is it possible to create a server which would be compatible with the Address Book system of OS X? I think I must be compatible with the CardDAV system.
Yes, you can create such a server, and there are plenty already. You can choose between CardDAV and LDAP, depending on your needs. If LDAP is good enough for your use case, you might even get away with just configuring OpenLDAP to use your database.
LDAP is usually just read & query only (think big company address book / yellow pages). CardDAV is usually read/write and full sync.
Has anyone already done that?
Many people have, the CalConnect CardDAV Server Implementations site alone lists 16, most of them FOSS. There are more.
If yes, how did you handle it? Created your own server?
I think this is the most common approach.
Is there a CardDav library for Python for example?
Please do your research, this is trivial to figure out ...
Many PHP servers (you mentioned Zend) are using SabreDAV as a basis.
I just want to be able to read my contacts using the Address Book of OS X.
That makes it a lot easier. While you can use a library like SabreDAV, implementing read-only CardDAV is really not that hard: authentication, a few XML requests for locating an address book, and then some code to render your existing records as vCards.
If you want to add editing, things get more complicated.
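The "render your existing records as vCards" step can be sketched in a few lines of Python; the dict keys (first, last, email) stand in for whatever columns your contacts table actually has:

```python
def record_to_vcard(contact):
    """Render one contacts-table row as a minimal vCard 3.0 (CRLF line endings)."""
    lines = [
        "BEGIN:VCARD",
        "VERSION:3.0",
        "N:%s;%s;;;" % (contact["last"], contact["first"]),
        "FN:%s %s" % (contact["first"], contact["last"]),
    ]
    if contact.get("email"):
        lines.append("EMAIL;TYPE=INTERNET:%s" % contact["email"])
    lines.append("END:VCARD")
    return "\r\n".join(lines) + "\r\n"
```

The read-only CardDAV layer then mostly consists of answering the locate/report requests with these vCards as the address-object payloads.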

Implementing mBaaS in Python

I am a web backend developer. In the past, I've used a lot of Python, and specifically Django, to create custom APIs that serve data, in JSON for instance, to web frontends.
Now I am facing the task of developing a mobile backend that needs to provide services such as push notifications, geolocation, etc. I am aware of the existing mBaaS providers, which could definitely address a lot of the issues with the task at hand; however, the project requires a lot of custom backend code, async tasks, algorithms that perform calculations on the data and in turn trigger additional behavior, as well as an extensive back office.
Looking at the features of the popular mBaaS providers, I feel like they are not able to meet all my needs; however, it would be nice to use some of their features, such as push notifications, instead of developing my own. Am I completely mistaken about mBaaS providers? Is this sort of hybrid approach even possible?
Thanks!
There are a ton of options out there. Personally, I'm still looking for the holy grail of mBaaS providers. I've tried Parse, DreamFactory, and most recently Azure Mobile Services.
All three are great for getting from PoC to v1, but the devil is always in the details. There are a few details to watch out for:
You sacrifice control for simplicity. Stay in the lanes and things should work; the moment you want to do something else is when complexity creeps in.
You are at the mercy of their infrastructure. Yes, even Amazon and Azure go down from time to time. (Note: DreamFactory is a self-hosted solution.)
You are locked into their platform. Any extra code customizations you make with their hooks (i.e. Parse's "CloudCode" and Azure's API scripts) will most likely not port to another platform.
Given the learning curve and trade-offs involved, I think you should just play the strong hand you already have. Why not host a Django app on Heroku? Add Django REST Framework and you can basically get an mBaaS up and running in less than a day.
Heroku has plenty of third party providers for things like Push notifications, Authentication mechanisms, and even search engines (Elasticsearch).
All that is required is to pip install the right packages, wire them into your code, and you are off and running.
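To make "up and running in less than a day" concrete, here is the shape of one such API endpoint, sketched with only the standard library so it is self-contained; DRF replaces all of this boilerplate with serializers and routers, and the `/devices` resource is a made-up example:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in data; in the Django version this would come from the ORM.
DEVICES = [{"id": 1, "push_token": "abc123"}]

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/devices":
            body = json.dumps(DEVICES).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, fmt, *args):
        pass  # silence per-request logging

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), ApiHandler).serve_forever()
```

Everything beyond this (auth, pagination, write endpoints) is exactly what the framework and the Heroku add-ons give you for free.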

OLAP Server for NodeJS

I have been looking for ways to provide analytics for an app which is powered by a REST server written in Node.js and MySQL. I discovered OLAP, which can actually make this much easier.
I found a Python library, Cubes, that provides an OLAP HTTP server called 'Slicer':
http://cubes.databrewery.org/
Can someone explain how this works? Does this mean I have to update my schema and create what are called fact tables?
Can this be used in conjunction with my Node.js app? Any examples? I have only created single-server apps so far. Would Python reside on the same Node.js server? How will it start? ('forever app.js' is my default script)
If I can't use Python, since I have no experience with it, what are the basics of doing it in Node.js?
My model is basically a list of words, so the OLAP queries I have are words made in days/weeks/months, of length 2/5/10 letters, in languages eng, french, german, etc.
Ideas, hints and guidance much appreciated!
As you found out, Cubes provides an HTTP OLAP server (the slicer tool).
Can someone explain how this works?
As an OLAP server, you can issue OLAP queries to it. The API is REST/JSON based, so you can easily query the server from JavaScript, Node.js, Python, or any other language of your choice via HTTP.
The server can answer OLAP queries. OLAP queries are based on a model of "facts" and "dimensions". You can, for example, query "the total sales amount for a given country and product, itemized by month".
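Because the API is plain HTTP/JSON, querying it from Python needs only the standard library. A sketch against Slicer's aggregate endpoint; the `words` cube and the drilldown/cut values are hypothetical names matching the poster's model:

```python
import json
import urllib.parse
import urllib.request

def aggregate_url(base, cube, drilldown=None, cut=None):
    """Build the URL for Slicer's /cube/<name>/aggregate endpoint."""
    params = {}
    if drilldown:
        params["drilldown"] = drilldown
    if cut:
        params["cut"] = cut
    query = urllib.parse.urlencode(params)
    return "%s/cube/%s/aggregate" % (base, cube) + ("?" + query if query else "")

def aggregate(base, cube, **kwargs):
    """Run the aggregation and decode the JSON reply."""
    with urllib.request.urlopen(aggregate_url(base, cube, **kwargs)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # e.g. word counts by month, for English words only
    print(aggregate("http://localhost:5000", "words",
                    drilldown="date:month", cut="language:eng"))
```

The same two-line request works from Node.js with its built-in http client, which is the whole point of the HTTP interface.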
Does this mean I have to update my schema and create what are called fact tables?
OLAP queries are built around the Fact and Dimension concepts.
OLAP-oriented datawarehousing strategies often involve the creation of these Fact and Dimension tables, building what is called a Star Schema or a Snowflake Schema. These schemas offer better performance for OLAP-type queries on relational databases. Data is often loaded by what is called an ETL process (it can be a simple script) that loads data in the appropriate form.
The Python Cubes framework, however, does not force you to alter your schema or create an alternate one. It has a SQL backend which allows you to define your model (in terms of Facts and Dimensions) without the need of changing the actual database model. This is the documentation for the model definition: https://pythonhosted.org/cubes/model.html .
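For the poster's case, a sketch of what such a model.json might look like; the cube, dimension, and measure names are made up to match the words data, while the field names follow the Cubes model documentation:

```json
{
    "cubes": [
        {
            "name": "words",
            "dimensions": ["language", "date"],
            "measures": [{"name": "length"}],
            "aggregates": [
                {"name": "word_count", "function": "count"},
                {"name": "avg_length", "function": "avg", "measure": "length"}
            ]
        }
    ],
    "dimensions": [
        {"name": "language"},
        {"name": "date", "role": "time", "levels": ["year", "month", "day"]}
    ]
}
```

This maps directly onto an existing words table, so queries can drill down by date levels or cut by language without any schema changes.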
However, in some cases you may still prefer to define a schema for Data Mining and use a transformation process to load data periodically. It depends on your needs, the amount of data you have, performance considerations, etc...
With Cubes you can also use other, non-RDBMS backends (e.g. MongoDB), some of which offer built-in aggregation capabilities that OLAP servers like Cubes can leverage.
Can this be used in conjunction with my NodeJS App?
You can issue queries to your Cubes Slicer server from NodeJS.
Any examples?
There is a Javascript client library to query Cubes. You probably want to use this one: https://github.com/Stiivi/cubes.js/
I don't know of any examples using NodeJS. You can try to get some inspiration from the included AngularJS application in Cubes (https://github.com/Stiivi/cubes/tree/master/incubator). Another client tool is CubesViewer which may be of use to you while building your model: http://jjmontesl.github.io/cubesviewer/ .
I have only created single-server apps so far. Would Python reside on the same Node.js server? How will it start? ('forever app.js' is my default script)
You would run the Cubes Slicer server as a web application, served directly by your web server (e.g. Apache). With Apache, you would use mod_wsgi, which allows it to serve Python applications.
Slicer can also run as a small standalone web server in its own process, which is very handy during development (though I wouldn't recommend it for production environments). In this case, it will be listening on its own port (typically http://localhost:5000).
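For reference, a minimal slicer.ini sketch for that standalone mode; the section and option names follow the Cubes documentation but vary slightly between versions, and the file paths and database URL here are assumptions:

```ini
[workspace]
model = model.json

[server]
host = localhost
port = 5000
reload = yes
log_level = info

[store]
type = sql
url = mysql://user:password@localhost/words_db
```

You would then start it with `slicer serve slicer.ini`.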
If I can't use Python since I have no experience with it, what are the basics of doing it in Node.js?
You don't really need to use Python at all. You can configure and run Cubes as your OLAP server, and issue queries from JavaScript code (i.e. directly from the browser). From the client's point of view, it is like a database system which you can query via HTTP, getting responses in JSON format.

Standalone Desktop App with Centralized Database (file) w/o Database Server?

I have been developing a fairly simple desktop application to be used by a group of 100-150 people within my department mainly for reporting. Unfortunately, I have to build it within some pretty strict confines similar to the specs called out in this post. The application will just be a self contained executable with no need to install.
The problem I'm running into is figuring out how to handle the database need. There will probably only be about 1GB of data for the app, but it needs to be available to everyone.
I would embed the database with the application (SQLite), but the data needs to be refreshed every week from a centralized process, so I figure it would be easier to maintain one database, rather than pushing updates down to the apps. Plus users will need to write to the database as well and those updates need to be seen by everyone.
I'm not allowed to set up a server for the database, so that rules out any good options for a true database. I'm restricted to File Shares or SharePoint.
It seems like I'm down to MS Access or SQLite. I'd prefer to stick with SQLite because I'm a fan of Python and SQLAlchemy, but based on what I've read, SQLite is not a good solution for multiple users accessing it over the network (if that's even possible).
Is there another option I haven't discovered for this setup or am I stuck working with MS Access? Perhaps I'll need to break down and work with SharePoint lists and apps?
I've been researching this for quite a while now, and I've run out of ideas. Any help is appreciated.
FYI, as I'm sure you can tell, I'm not a professional developer. I have enough experience in web / python / vb development that I can get by - so I was asked to do this as a side project.
SQLite can operate across a network and be shared among different processes. It is not a good solution when the application is write-heavy (because it locks the database file for the duration of a write), but if the application is mostly reporting it may be a perfectly reasonable solution.
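Concretely, the main things to get right in that setup are keeping write transactions short and tolerating the lock. A sketch; the UNC share path and the reports table are placeholders:

```python
import sqlite3

DB_PATH = r"\\fileshare\department\reports.db"  # placeholder UNC path

def write_report(db_path, user, note):
    # timeout=30 makes sqlite retry for up to 30 seconds instead of
    # failing immediately when another user holds the write lock.
    conn = sqlite3.connect(db_path, timeout=30)
    try:
        with conn:  # one short transaction; the file lock is held only briefly
            conn.execute(
                "CREATE TABLE IF NOT EXISTS reports (user TEXT, note TEXT)")
            conn.execute("INSERT INTO reports VALUES (?, ?)", (user, note))
    finally:
        conn.close()
```

With 100-150 mostly-reading users and occasional short writes like this, lock contention stays manageable.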
As my options are limited, I decided to go with a built-in database for each app using SQLite. The DB will only need to be updated every week or two, so I figured a 30-second update pulling from flat files will be OK. Then the user will have all the data locally to browse as needed.

How to make a cost effective but scalable site?

We are doing a Portal Technology Assessment in which we will be creating a placement portal for campuses and industry to help place students. The portal will handle large volumes of data and people logging in, approximately 1000 concurrent users/day.
What technology should I use? PHP with CakePHP as a framework, Ruby on Rails, ASP.NET, Python? Or should I opt for cloud computing? Which of these is the most cost-beneficial?
Any of those will do; it really depends on what you know. If you're comfortable with Python, use Django. If you like Ruby, go with RoR. These modern frameworks are built to scale; assuming you're not going to be developing something on the scale of Facebook, they should suffice.
I personally recommend nginx as your main server to host static content and possibly reverse-proxy to Django/mod_wsgi/Apache2.
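A minimal sketch of that nginx layout; the domain, filesystem paths, and the upstream port are placeholders for your own setup:

```nginx
server {
    listen 80;
    server_name portal.example.com;

    # nginx serves static files itself
    location /static/ {
        root /var/www/portal;
        expires 7d;
    }

    # everything else is proxied to Django behind Apache/mod_wsgi
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Letting nginx handle static content keeps the heavier application workers free for dynamic requests.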
Another important aspect is caching, make sure to use something like memcached and make sure the framework has some sort of plugin or it's easily attachable.
Language choice is important: you must choose the language that you and your team feel most comfortable with, since you will be developing a mid-to-large application. With Python, use Django; with ASP.NET, use Web Forms or ASP.NET MVC, whichever you feel better with; with Ruby, RoR; and with PHP there is a large selection of frameworks.
1000 concurrent users is not that much, though it depends on what the users will do. Places where users fetch large amounts of data are better cached, with whatever caching engine you want. Design the application so you can easily swap between real DB calls and calls to the cache; for that, use data objects, e.g. an object array for logins if you need it. You can also save some information in cookies when the user logs in, such as their last login and email, so you make fewer read-mode calls (SELECT queries) to the DB.
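The "easily swap between real DB calls and calls to the cache" idea is the cache-aside pattern. A sketch; the dict stands in for a memcached client, and the loader for your real SELECT:

```python
import time

class CacheAside:
    """Try the cache first; on a miss, call the real DB loader and
    remember the result for `ttl` seconds."""

    def __init__(self, cache, loader, ttl=300):
        self.cache = cache      # dict-like; e.g. a memcached client wrapper
        self.loader = loader    # function doing the real SELECT
        self.ttl = ttl

    def get(self, key):
        hit = self.cache.get(key)
        if hit is not None and hit[1] > time.time():
            return hit[0]       # fresh cached value
        value = self.loader(key)
        self.cache[key] = (value, time.time() + self.ttl)
        return value
```

Because the cache is injected, swapping the dict for a real memcached client (or disabling caching entirely in tests) requires no changes to the calling code.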
Use a cookie-less domain for static content such as images, JS, and CSS files. Set that domain up on the fastest, simplest server you can, probably something Linux-based.
For servers, the best advice is either to get one large machine and run virtual machines on it (with VMware or another solution), or to get a few smaller servers, which is better: if the one big server goes down you lose everything, whereas if one of several goes down you can still keep some things running. Especially if you set up what I call railroad mode: on one server run the application server (IIS or Apache) as master with SQL as slave, and on the other run SQL as master with the application server as slave. So one server serves IIS/Apache and the other serves SQL; if one goes down, you just need to change a line in the hosts file to point things somewhere else (I don't know how to do that in Linux).
Keep the last server for static content.
Cloud computing you will use whether you want to or not: you already share resources with other applications, for example the Google-hosted copies of jQuery and jQuery UI. But you are creating a unique application, and I don't believe basing the core of it on cloud computing will do any good. Do use the large sites' CDNs, though.
